gcp-cloud-functions

Build serverless functions on Google Cloud Functions. Deploy HTTP and event-driven functions triggered by Pub/Sub, Cloud Storage, and Firestore. Configure runtime settings, manage dependencies, and connect to other GCP services.

26 stars

Best use case

gcp-cloud-functions is best used when you need a repeatable AI agent workflow instead of a one-off prompt.


Teams using gcp-cloud-functions should expect more consistent output, faster repeated execution, and less prompt rewriting.

When to use this skill

  • You want a reusable workflow that can be run more than once with consistent structure.

When not to use this skill

  • You only need a quick one-off answer and do not need a reusable workflow.
  • You cannot install or maintain the underlying files, dependencies, or repository context.

Installation

Claude Code / Cursor / Codex

curl -o ~/.claude/skills/gcp-cloud-functions/SKILL.md --create-dirs "https://raw.githubusercontent.com/TerminalSkills/skills/main/skills/gcp-cloud-functions/SKILL.md"

Manual Installation

  1. Download SKILL.md from GitHub
  2. Place it in .claude/skills/gcp-cloud-functions/SKILL.md inside your project
  3. Restart your AI agent — it will auto-discover the skill

How gcp-cloud-functions Compares

| Feature / Agent | gcp-cloud-functions | Standard Approach |
|---|---|---|
| Platform Support | Not specified | Limited / Varies |
| Context Awareness | High | Baseline |
| Installation Complexity | Unknown | N/A |

Frequently Asked Questions

What does this skill do?

Build serverless functions on Google Cloud Functions. Deploy HTTP and event-driven functions triggered by Pub/Sub, Cloud Storage, and Firestore. Configure runtime settings, manage dependencies, and connect to other GCP services.

Where can I find the source code?

You can find the source code on GitHub using the link provided at the top of the page.

SKILL.md Source

# GCP Cloud Functions

Google Cloud Functions is a serverless execution environment for building event-driven applications. Write single-purpose functions that respond to HTTP requests, Pub/Sub messages, Cloud Storage events, or Firestore changes — no infrastructure to manage.

## Core Concepts

- **HTTP Function** — triggered by HTTP requests, returns a response
- **Event Function** — triggered by cloud events (Pub/Sub, Storage, Firestore)
- **Gen 2** — latest version, built on Cloud Run, longer timeouts, concurrency
- **Trigger** — the event source that invokes the function
- **Runtime** — language environment (Node.js, Python, Go, Java, etc.)

## HTTP Functions

```javascript
// index.js — HTTP function that processes webhook payloads
const functions = require('@google-cloud/functions-framework');

functions.http('handleWebhook', (req, res) => {
  // Validate the method before touching the body
  if (req.method !== 'POST') {
    return res.status(405).send('Method not allowed');
  }

  const { event, data } = req.body;

  console.log(`Received event: ${event}`, data);

  switch (event) {
    case 'order.created':
      // Process new order
      res.json({ status: 'processed', orderId: data.id });
      break;
    default:
      res.json({ status: 'ignored', event });
  }
});
```

```bash
# Deploy an HTTP function (Gen 2)
gcloud functions deploy handle-webhook \
  --gen2 \
  --runtime nodejs20 \
  --region us-central1 \
  --source . \
  --entry-point handleWebhook \
  --trigger-http \
  --allow-unauthenticated \
  --memory 256Mi \
  --timeout 60 \
  --set-env-vars "NODE_ENV=production"
```

## Pub/Sub Triggered Functions

```python
# main.py — process Pub/Sub messages
import base64
import json
import functions_framework

@functions_framework.cloud_event
def process_message(cloud_event):
    """Triggered by a Pub/Sub message."""
    data = base64.b64decode(cloud_event.data["message"]["data"]).decode()
    message = json.loads(data)

    print(f"Processing order: {message['order_id']}")

    # Process the order (fulfill_order is an application-specific helper, not shown)
    result = fulfill_order(message)
    print(f"Order {message['order_id']} fulfilled: {result}")
```
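
The handler above assumes the payload survives a base64/JSON round trip through the CloudEvent envelope. A quick sketch of both sides using only the standard library (the payload fields here are illustrative):

```python
import base64
import json

# What a publisher sends: a JSON payload, which Pub/Sub base64-encodes
payload = {"order_id": "A-1001", "items": 3}
encoded = base64.b64encode(json.dumps(payload).encode()).decode()

# What the function receives inside the CloudEvent envelope
cloud_event_data = {"message": {"data": encoded}}

# Mirror of the decoding done in process_message above
decoded = json.loads(base64.b64decode(cloud_event_data["message"]["data"]).decode())
print(decoded["order_id"])  # A-1001
```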

```bash
# Deploy Pub/Sub triggered function
gcloud functions deploy process-order \
  --gen2 \
  --runtime python312 \
  --region us-central1 \
  --source . \
  --entry-point process_message \
  --trigger-topic order-events \
  --memory 512Mi \
  --timeout 120 \
  --set-secrets "DATABASE_URL=db-url:latest"
```

## Cloud Storage Triggered Functions

```python
# main.py — process uploaded files
import functions_framework
from google.cloud import storage, vision

@functions_framework.cloud_event
def process_upload(cloud_event):
    """Triggered when a file is uploaded to Cloud Storage."""
    data = cloud_event.data
    bucket_name = data["bucket"]
    file_name = data["name"]

    if not file_name.lower().endswith(('.jpg', '.png', '.jpeg')):
        print(f"Skipping non-image file: {file_name}")
        return

    print(f"Processing image: gs://{bucket_name}/{file_name}")

    # Generate thumbnail, run OCR, etc.
    client = storage.Client()
    bucket = client.bucket(bucket_name)
    blob = bucket.blob(file_name)
    image_data = blob.download_as_bytes()

    # Process image...
    print(f"Processed {file_name} ({len(image_data)} bytes)")
```
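
For the "generate thumbnail" step, one way to derive a destination object path with the standard library alone. This is a sketch; the `thumbnails/` prefix and `_thumb` suffix are assumptions, not part of the skill:

```python
from pathlib import PurePosixPath

def thumbnail_path(file_name: str) -> str:
    """Map 'photos/cat.jpg' -> 'thumbnails/photos/cat_thumb.jpg'."""
    p = PurePosixPath(file_name)
    return str(PurePosixPath("thumbnails") / p.parent / f"{p.stem}_thumb{p.suffix}")

print(thumbnail_path("photos/cat.jpg"))  # thumbnails/photos/cat_thumb.jpg
```

Writing thumbnails to a separate bucket (or prefix) avoids the classic loop where the function's own output re-triggers it.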

```bash
# Deploy Storage triggered function
gcloud functions deploy process-upload \
  --gen2 \
  --runtime python312 \
  --region us-central1 \
  --source . \
  --entry-point process_upload \
  --trigger-event-filters="type=google.cloud.storage.object.v1.finalized" \
  --trigger-event-filters="bucket=my-uploads-bucket" \
  --memory 1Gi \
  --timeout 300
```

## Firestore Triggered Functions

```javascript
// index.js — react to Firestore document changes
const functions = require('@google-cloud/functions-framework');
const { Firestore } = require('@google-cloud/firestore');

functions.cloudEvent('onUserCreated', async (cloudEvent) => {
  const data = cloudEvent.data;
  const newValue = data.value.fields;

  const email = newValue.email.stringValue;
  const name = newValue.name.stringValue;

  console.log(`New user created: ${name} (${email})`);

  // Send welcome email, create default settings, etc.
  const db = new Firestore();
  await db.collection('user-settings').doc(data.value.name.split('/').pop()).set({
    theme: 'light',
    notifications: true,
    createdAt: Firestore.FieldValue.serverTimestamp()
  });
});
```
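
The JS handler pulls the document ID off the event's fully qualified resource name with `split('/').pop()`. The same parsing in Python, with an illustrative resource name:

```python
# Firestore events identify the document by a full resource name like this
name = "projects/my-project/databases/(default)/documents/users/abc123"

# The document ID is the last path segment
user_id = name.split("/")[-1]
print(user_id)  # abc123
```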

```bash
# Deploy Firestore triggered function
gcloud functions deploy on-user-created \
  --gen2 \
  --runtime nodejs20 \
  --region us-central1 \
  --source . \
  --entry-point onUserCreated \
  --trigger-event-filters="type=google.cloud.firestore.document.v1.created" \
  --trigger-event-filters="database=(default)" \
  --trigger-event-filters-path-pattern="document=users/{userId}" \
  --memory 256Mi
```

## Managing Functions

```bash
# List deployed functions
gcloud functions list --gen2 --region us-central1
```

```bash
# View function details
gcloud functions describe process-order --gen2 --region us-central1
```

```bash
# View logs
gcloud functions logs read process-order --gen2 --region us-central1 --limit 50
```

```bash
# Delete a function
gcloud functions delete process-order --gen2 --region us-central1
```

## Local Development

```bash
# Run function locally
npx @google-cloud/functions-framework --target=handleWebhook --port=8080
```

```bash
# Test locally with curl
curl -X POST http://localhost:8080 \
  -H "Content-Type: application/json" \
  -d '{"event":"order.created","data":{"id":"12345"}}'
```

## Best Practices

- Use Gen 2 for all new functions (better performance, concurrency support)
- Set memory and timeout based on actual needs — don't over-provision
- Use Secret Manager for credentials, not environment variables
- Implement idempotent handlers — events may be delivered more than once
- Use structured logging for better observability in Cloud Logging
- Set min-instances to avoid cold starts for latency-sensitive functions
- Use the functions framework for local development and testing
- Keep functions focused — one function, one purpose
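
Structured logging, mentioned above, amounts to writing one JSON object per line to stdout; Cloud Logging recognizes fields such as `severity` and `message`. A minimal sketch (the field names beyond those two are illustrative):

```python
import json

def log(severity: str, message: str, **fields) -> str:
    """Emit one JSON object per line; Cloud Logging maps 'severity' and 'message'."""
    entry = json.dumps({"severity": severity, "message": message, **fields})
    print(entry)
    return entry

line = log("INFO", "Order fulfilled", order_id="A-1001")
```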

Related Skills

All from TerminalSkills/skills:

  • step-functions — You are an expert in AWS Step Functions, the serverless orchestration service for building workflows as state machines. You help developers coordinate Lambda functions, API calls, and AWS services using visual workflows with branching, parallel execution, error handling, retries, and human approval steps — building reliable, observable distributed systems without custom orchestration code.

  • hetzner-cloud — Manage Hetzner Cloud infrastructure from the terminal. Use when a user asks to create a Hetzner server, manage VPS instances, set up firewalls, configure networks, manage volumes, create snapshots, handle SSH keys, or provision infrastructure on Hetzner. Covers the hcloud CLI for all resource types. For deploying applications on top of Hetzner servers, see coolify.

  • gcp-cloud-storage — Manage Google Cloud Storage for scalable object storage. Create and configure buckets, upload and organize objects, generate signed URLs for secure temporary access, set lifecycle rules for cost optimization, and configure access control.

  • gcp-cloud-sql — Provision and manage Cloud SQL instances on Google Cloud for MySQL, PostgreSQL, and SQL Server. Configure high availability, read replicas, automated backups, IAM database authentication, the Cloud SQL Auth Proxy, and Terraform deployments. Use for managed relational databases on GCP.

  • gcp-cloud-run — Deploy serverless containers on Google Cloud Run — services for HTTP traffic, jobs for batch and scheduled tasks, and worker pools for always-on pull-based background processing. Build and push container images, configure auto-scaling from zero, split traffic for canary deploys, and set up custom domains with managed TLS.

  • gcloud — Google Cloud CLI for managing GCP resources. Use when the user needs to work with Compute Engine, Cloud Storage, Cloud Functions, IAM, GKE, and other Google Cloud services from the terminal.

  • cloudflare-workers — Assists with building and deploying applications on Cloudflare Workers edge computing platform. Use when working with Workers runtime, Wrangler CLI, KV, D1, R2, Durable Objects, Queues, or Hyperdrive. Trigger words: cloudflare, workers, edge functions, wrangler, KV, D1, R2, durable objects, edge computing.

  • cloudflare-vectorize — Serverless vector database at the edge with Cloudflare Vectorize. Use when: building semantic search on Cloudflare Workers, RAG pipelines at the edge, low-latency vector similarity search, or storing and querying embeddings without managing a separate vector database.

  • cloudflare-ai — You are an expert in Cloudflare Workers AI, the serverless AI inference platform running on Cloudflare's global network. You help developers run LLMs, embedding models, image generation, speech-to-text, and translation models at the edge with zero cold starts, pay-per-use pricing, and integration with Workers, Pages, and Vectorize — enabling AI features without managing GPU infrastructure.

  • cloud-resource-analyzer — Finds orphaned, idle, and underutilized cloud resources across AWS, GCP, or Azure accounts. Use when someone needs to audit cloud spending, find unused EBS volumes, stale snapshots, unattached IPs, idle load balancers, or oversized RDS instances. Trigger words: cloud waste, orphaned resources, unused volumes, cloud audit, infrastructure cleanup, cloud bill analysis.

  • azure-functions — Build serverless applications with Azure Functions. Create HTTP and event-driven functions with input/output bindings, configure triggers for queues, timers, and blob storage. Use Durable Functions for stateful orchestration workflows.

  • aws-cloudfront — Configure Amazon CloudFront for global content delivery. Set up distributions with S3 and ALB origins, define cache behaviors and TTLs, invalidate cached content, and use Lambda@Edge for request/response manipulation at the edge.