The HK AI Stack

How to Get Gemini in Hong Kong: Vertex AI, Workspace, and the Gotchas

Hong Kong AI Podcast / 2026-03-07 / 6 min read / Gemini, Google, Vertex AI, Google Workspace, Hong Kong

Google's Gemini AI — including Gemini 2.5 Pro and Flash — isn't straightforwardly available in Hong Kong. The consumer Gemini app is blocked, the direct Gemini API is blocked, and the Google AI Pro plan is blocked. These restrictions are set by Google, not by Hong Kong authorities. (Source: Google AI Available Regions)

But there are paths that work. Each comes with landmines nobody warns you about.

What's Blocked, What's Not

Let's be precise:

| Product | Available in HK? | Notes |
|---|---|---|
| Gemini app (consumer chat) | No | Requires a VPN |
| Gemini API (direct, ai.google.dev) | No | Blocked by IP geolocation (Source: Google AI Forum) |
| Google AI Pro plan | No | Blocked in HK, despite being a paid plan |
| Google AI Studio | No | Same restrictions as the direct API |
| Vertex AI (Google Cloud) | Yes | Different terms, enterprise path |
| Google Workspace (paid) | Yes | Gemini features in Gmail, Docs, etc. |
| Poe | Yes | Third-party aggregator with Gemini access (see our Poe article) |

The key distinction: Google blocks its consumer and developer AI products in HK, but its enterprise cloud products (Vertex AI, Workspace) operate under different terms and are available. (Source: MJPM HK)

Path 1: Vertex AI (For Developers and Teams)

What It Is

Vertex AI is Google Cloud's AI platform. It provides API access to Gemini models through Google Cloud infrastructure — no VPN, no workarounds, fully official.

If you have a Google Cloud project, you can call Gemini through the Vertex AI API endpoint. This is the developer path: you get programmatic access to Gemini 2.5 Pro, Gemini 2.5 Flash, and the full model family. (Source: Google Cloud Vertex AI Locations)

How to Set It Up

1. Create a Google Cloud project (or use an existing one)
2. Enable the Vertex AI API
3. Set up billing (you need a payment method on file)
4. Choose your region — this matters more than you think (see below)
5. Call the API using Google's client libraries or the REST API
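As a sketch of step 5, the native REST endpoint for a Gemini `generateContent` call follows a predictable pattern. The project, region, and model values below are placeholders, not recommendations:

```python
# Builds the Vertex AI REST URL for a Gemini generateContent call.
# Project/region/model values here are placeholders.
def gemini_endpoint(project: str, region: str, model: str) -> str:
    return (
        f"https://{region}-aiplatform.googleapis.com/v1/"
        f"projects/{project}/locations/{region}/"
        f"publishers/google/models/{model}:generateContent"
    )

url = gemini_endpoint("my-project", "us-central1", "gemini-2.5-flash")
# POST requests to this URL need an OAuth bearer token, e.g. from
# `gcloud auth print-access-token`.
```

In practice you would usually let Google's client libraries build this URL for you, but knowing the shape helps when debugging 403s and 429s.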

Vertex AI also exposes an OpenAI-compatible endpoint, so tools like Cursor, LangChain, and most AI frameworks can connect with little more than a base-URL change.
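A minimal sketch of that compatibility layer: the base URL below follows Google's documented pattern for the OpenAI-style Chat Completions endpoint at the time of writing (note the `v1beta1` path), with placeholder project and region values — verify against current docs before relying on it:

```python
# Base URL for Vertex AI's OpenAI-compatible Chat Completions endpoint.
# Project/region are placeholders; the v1beta1 path may change over time.
def openai_compat_base_url(project: str, region: str) -> str:
    return (
        f"https://{region}-aiplatform.googleapis.com/v1beta1/"
        f"projects/{project}/locations/{region}/endpoints/openapi"
    )

base_url = openai_compat_base_url("my-project", "us-central1")
# With the `openai` package, a client would then be configured roughly as:
#   client = OpenAI(base_url=base_url, api_key=access_token)
# and Gemini addressed with a "google/"-prefixed model name,
# e.g. "google/gemini-2.5-flash".
```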

The Regional Rate Limit Gotcha

Here's something buried so deep in Google's docs that you probably can't find it. And if you do find it, good luck understanding it.

Rate limits on Vertex AI can apply to an entire region, not just your individual project.

That means you can wake up, run zero requests, and already be rate-limited because other users in the same region have consumed the quota. You'll get 429 errors and have no idea why — because you didn't do anything.

The fix: choose a region with lower traffic. US regions tend to have the most capacity. Asia regions (like asia-southeast1 in Singapore) are closer geographically but may have tighter shared quotas. You might need to experiment — or ask for a quota increase through the Google Cloud console.
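One pragmatic mitigation is to retry across a preference-ordered list of regions when a 429 comes back. A minimal sketch — the region list, retry counts, and the `RateLimited` exception are illustrative, not part of any Google SDK:

```python
import time

# Hypothetical preference order: US capacity first, then closer regions.
REGIONS = ["us-central1", "us-east5", "asia-southeast1"]

class RateLimited(Exception):
    """Stand-in for an HTTP 429 response from the API."""

def call_with_region_fallback(call, regions=REGIONS, retries=2):
    """Try each region in turn; back off briefly on 429s, then move on."""
    for region in regions:
        for attempt in range(retries):
            try:
                return call(region)
            except RateLimited:
                time.sleep(0.01 * 2 ** attempt)  # tiny backoff for the sketch
    raise RateLimited("all regions exhausted")
```

For example, a `call` that always 429s in `us-central1` but succeeds elsewhere would return its result from `us-east5` after the first fallback. Real code would catch the actual 429 error type raised by whichever client library you use.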

The Cost Premium

Vertex AI pricing is typically 20-40% higher than the direct Gemini API — roughly $1.50-1.75 per million input tokens vs. $1.25 for direct access. You're paying for the enterprise wrapper and the ability to access Gemini from HK at all. Gemini 2.5 Flash is still cheap (fractions of a cent per request). (Source: AI Free API — Gemini Regional Restrictions)
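Using the article's rough rates, the premium is easy to quantify. Both rates and the monthly workload below are approximations for illustration — check the current pricing pages before budgeting:

```python
# Back-of-envelope input-token cost comparison, using the rough rates above.
VERTEX_INPUT_PER_M = 1.50  # USD per 1M input tokens via Vertex AI (approx.)
DIRECT_INPUT_PER_M = 1.25  # USD per 1M input tokens via the direct API

def input_cost(tokens: int, rate_per_m: float) -> float:
    return tokens / 1_000_000 * rate_per_m

monthly_tokens = 50_000_000  # hypothetical workload
vertex = input_cost(monthly_tokens, VERTEX_INPUT_PER_M)  # 75.0 USD
direct = input_cost(monthly_tokens, DIRECT_INPUT_PER_M)  # 62.5 USD
premium = (vertex - direct) / direct                     # 0.2, i.e. 20%
```

At this scale the premium is about $12.50/month — usually cheaper than the engineering time a VPN workaround would cost.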

The Documentation Problem

Google's AI documentation is sprawling, fragmented, and often out of date. Vertex AI docs alone span multiple products (AI Studio, Model Garden, Agent Builder, Vertex AI Search, Gemini API, PaLM API legacy...) and it's genuinely hard to find the specific page you need.

If you're stuck, try:

  • The Vertex AI Gemini API quickstart — not the general Vertex AI docs
  • Honestly, asking another AI model to parse Google's docs for you

Path 2: Google Workspace (For Personal and Business Use)

What It Is

If you have a paid Google Workspace account — that's Gmail for business, starting at $7.20/user/month — you get access to Gemini through Google's built-in apps.

This includes:

  • Gemini chat integrated into Google's apps
  • Veo 3 (Google's video generation model) through the built-in video app
  • AI features in Gmail, Docs, Sheets, and Slides

Why This Matters

This is the consumer/business path. No API, no code, no cloud project. You pay for Google Workspace, you get Gemini. It works in Hong Kong because Workspace is globally available.

For individuals who want Gemini without setting up cloud infrastructure, this is the simplest option. The catch: it's limited to Google's own apps and interfaces. You can't build custom applications on top of it — and you can't use it as an API endpoint.

Important distinction: The Google AI Pro plan (the consumer Gemini subscription) is blocked in HK. Google Workspace (the business subscription) is not. They look similar but are different products with different availability.

How to Access

1. Sign up for Google Workspace (workspace.google.com)
2. Gemini features are included in most paid plans
3. Access Gemini through Google apps — no additional setup

Path 3: Poe (The Side Door)

Poe (by Quora) is an AI aggregator that provides access to Gemini, Claude, GPT-4, and other models through a single interface. It works in Hong Kong without restrictions.

This isn't an "official" Google path, but it's a legitimate third-party service. If you just need to chat with Gemini occasionally and don't need API access, Poe is the path of least resistance. See our detailed Poe article for more.

Comparison

| Factor | Vertex AI | Google Workspace | Poe |
|---|---|---|---|
| Use case | Development, API access | Productivity, chat | Quick chat, comparison |
| Access to models | Full API | Gemini in Google apps | Gemini + Claude + GPT |
| Setup complexity | Medium | Low | None |
| Cost | Usage-based (20-40% premium) | Per-user subscription | Free tier / $19.99/mo |
| Custom applications | Yes | No | No |
| Rate limit risks | Regional shared quotas | Standard limits | Message quotas |

How This Compares to Other "Loophole" Access

Hong Kong developers have a pattern of finding indirect paths to models that aren't officially available:

  • GPT-4/5: Available through Microsoft Copilot and Azure OpenAI (see our article on Microsoft Copilot: The Compliant Loophole)
  • Claude: Available through Amazon Bedrock, Poe, and some third-party platforms
  • Gemini: Available through Vertex AI and Google Workspace (this article)

The irony: getting access isn't the hard part anymore. The hard part is navigating each platform's documentation, pricing model, and regional quirks. Every path works, but every path has its own set of gotchas.

For teams that need Gemini specifically — maybe for its long context window, its multimodal capabilities, or its integration with Google Cloud services — Vertex AI is the way. Just budget some time for the setup, and don't be surprised when the rate limiter hits you for someone else's traffic.



Navigating AI access in Hong Kong? Subscribe to the Hong Kong AI Podcast for practical guides from people who've been through it.


Something out of date or wrong? AI moves fast and we want to get it right. Let us know at contact@hongkongaipodcast.com