This is a summary post of my takeaways from the "Develop GenAI Apps with Gemini and Streamlit" course in Google Cloud's Gemini for Google Cloud learning path, taken as part of the Google Study Jam x Seoul (IT's) study group.


Overview
The "Develop GenAI Apps with Gemini and Streamlit" course covers the Python SDK, the Gemini API, and deploying a Streamlit web app with Cloud Run.
Objective
- Modify a prompt with cURL to test the Gemini API
- Write the prompt Python code
- Test the Streamlit web app
- Modify the Dockerfile and push the Docker image to Artifact Registry
- Deploy the application to Cloud Run and test
Task 1. Use cURL to test a prompt with the API
In the Google Cloud console, from the Navigation menu I opened Vertex AI > Workbench, where I could test the Gemini API in a Jupyter Notebook environment.
By running through each notebook cell with cURL, I was able to interact with the model.
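Out of curiosity, the notebook's cURL calls can be mirrored in Python. Below is a minimal sketch: the endpoint path follows the Vertex AI REST format, while the project, region, and model names are placeholder assumptions, not the lab's values.

```python
import json
import urllib.request


def build_gemini_request(project: str, region: str, model: str, prompt: str):
    """Build the URL and JSON body for a Vertex AI generateContent call."""
    url = (
        f"https://{region}-aiplatform.googleapis.com/v1/projects/{project}"
        f"/locations/{region}/publishers/google/models/{model}:generateContent"
    )
    body = {"contents": [{"role": "user", "parts": [{"text": prompt}]}]}
    return url, body


def call_gemini(project, region, model, prompt, access_token):
    """POST the request; access_token comes from `gcloud auth print-access-token`."""
    url, body = build_gemini_request(project, region, model, prompt)
    req = urllib.request.Request(
        url,
        data=json.dumps(body).encode(),
        headers={
            "Authorization": f"Bearer {access_token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

The request body has the same shape as the JSON sent by the notebook's cURL commands, so the two approaches are interchangeable.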
Takeaway
cURL (client URL) is a command-line tool for transferring data with URL syntax. In this lab, each notebook cell used it to send an HTTP POST request with a JSON prompt payload to the Gemini API endpoint and print the model's response.
Task 2. Write Streamlit framework and prompt Python code to complete chef.py
Using the Cloud Shell terminal, I cloned the given GitHub repo, which I believe is a starter pack for using generative AI on Google Cloud, including sample apps. I was then guided to download the `chef.py` file with the `gsutil cp` command for copying files.
The interface also offers the Cloud Shell Editor (alongside the Cloud Shell Terminal), a web-based VS Code-like environment, which made the code and its structure much easier to understand than the terminal alone. There, I added a radio button option for users to choose their wine preference, as well as a new Gemini prompt beginning with "I am a Chef. I need to create recipes..."
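The changes described above can be sketched roughly like this. The widget labels and the exact prompt wording are my own reconstruction, not the lab's code, and the import is guarded so the sketch also runs outside a Streamlit environment.

```python
def build_chef_prompt(food_items: str, wine: str) -> str:
    """Assemble the chef prompt (wording is a reconstruction, not the lab's exact text)."""
    return (
        "I am a Chef. I need to create recipes for my customers using "
        f"these ingredients: {food_items}. "
        f"Please take this wine preference into account: {wine}."
    )


# The Streamlit UI: a radio widget feeding the prompt (run with `streamlit run chef.py`)
try:
    import streamlit as st

    wine = st.radio("Do you have a wine preference?", ["Red", "White", "None"])
    food_items = st.text_input("What ingredients do you have?")
    st.text(build_chef_prompt(food_items, wine))
except ImportError:
    pass  # streamlit is not installed outside the lab environment
```

Keeping the prompt assembly in a plain function makes it easy to test separately from the Streamlit widgets.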
Lastly, I uploaded the `chef.py` file to the bucket by running `gcloud storage cp chef.py DESTINATION`.
Takeaway
`gsutil` is a CLI for interacting with Google Cloud Storage. It allows users to perform various operations on buckets and objects in Cloud Storage directly from the command line.
Task 3. Test the application
To test the Streamlit application, I needed to set up a Python virtual environment and install the dependencies.
python3 -m venv gemini-streamlit
source gemini-streamlit/bin/activate
pip install -r requirements.txt
Problem
Next, to run the app in the Cloud Shell Terminal, I first simply typed the command below.
streamlit run chef.py
However, the app kept showing a loading UI, and sometimes it did not even open in the browser, with the console error message below.
Client Error: WebSocket onerror
(This is where I spent most of my time getting stuck; I eventually had to retake the final lab a couple more times.)
It turned out that I had to run the app as specified in the README.md.
Solution

The solution is to run the app with the port changed to 8080 and CORS disabled:

streamlit run chef.py --server.port=8080 --server.enableCORS=false --server.enableXsrfProtection=false

As a side note, the CORS and XSRF protection options should both be turned off together to disable either one of them, as described in the official Streamlit docs.
Why WebSocket connection error running Streamlit App in Cloud?
First, what is WebSocket?
The Streamlit framework uses WebSocket to communicate between the Python server and the client. Instead of the normal HTTP request-response cycle, it establishes a WebSocket connection for bidirectional communication. That's why Streamlit renders updated UI state immediately, in real time, without requiring a page refresh: when a user interacts with the UI, the event is sent to the server through the WebSocket, and the server sends back the updated UI the same way.
Then why do we need to disable CORS to run a Streamlit app (even locally) in the Cloud Shell environment?
- The Streamlit server normally allows the origin localhost:8501.
- In a Cloud Run container, the default origin is localhost:8080.
- Cloud Run is a container-based serverless service. When Google runs the container, it sends external requests to the container through a proxy. Because of this, the container is set by default to run the HTTP server on port 8080.
Therefore, in the Cloud Run environment, the HTTP server must run on port 8080, so we change the Streamlit app to run with `--server.port=8080`.
However, this causes a mismatch with Streamlit’s default allowed origin (port 8501), which leads to a CORS error. That’s why we disable CORS when changing the port to 8080.
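Conceptually, the server-side check behaves like this simplified sketch. This is not Streamlit's actual implementation (its Tornado server performs the equivalent check during the WebSocket handshake); it only illustrates why the 8501-vs-8080 mismatch gets a request rejected.

```python
def is_origin_allowed(origin: str, allowed_origins: list[str]) -> bool:
    """Mimic the handshake-time check: only known origins may connect."""
    return origin in allowed_origins


# By default the Streamlit server expects its own origin (port 8501)
allowed = ["http://localhost:8501"]

# A page served from port 8080 presents a different Origin header, so the
# handshake is rejected; the browser surfaces this only as a WebSocket error.
print(is_origin_allowed("http://localhost:8080", allowed))  # → False

# Running with --server.enableCORS=false effectively skips this check entirely.
```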
Relationship between CORS errors and WebSocket errors?
If the problem stems from a CORS issue, then why does Streamlit show only a WebSocket error in the console, instead of explicitly mentioning CORS?
A WebSocket connection also starts with an HTTP request and verifies headers. For example, the browser sends a request to the server like this:
GET /stream HTTP/1.1
Host: localhost:8080
Upgrade: websocket
Connection: Upgrade
Origin: http://localhost:8080
Sec-WebSocket-Key: abc123...
The server checks the `Origin` header. If it is not correct, the server either rejects the connection or responds with 403 Forbidden. For normal HTTP APIs, the browser console would show a "CORS error". During the WebSocket handshake, however, if the origin is rejected, the server just closes the connection or sends a 403. The browser then reports it only as a WebSocket error, like this:
WebSocket connection to 'ws://localhost:8080/...' failed
At the browser network level, it appears only as a WebSocket error.
In conclusion: A WebSocket error can happen because of a CORS issue. To prevent this, we set the Streamlit app to use port 8080 and disable CORS (--server.enableCORS=false).
WebSocket & CORS Error Flow (Streamlit)
Browser Streamlit Server
| |
|--- HTTP GET /stream -------------->| (Upgrade: websocket, Origin: http://localhost:8080)
| |
| |--- Check Origin header
| | |
| | |-- Origin OK → 101 Switching Protocols → WebSocket established
| | |
| | |-- Origin NOT OK → Reject / 403 Forbidden
| |
|<-- Connection failed -------------| (Browser shows: WebSocket connection error)
Task 4. Modify the Dockerfile and push image to the Artifact Registry
1. Modify the Dockerfile's `ENTRYPOINT` so that `chef.py` is the entry point for the Docker image.
FROM python:3.13-slim
EXPOSE 8080
WORKDIR /app
COPY . ./
RUN pip install --no-cache-dir -r requirements.txt
ENTRYPOINT ["streamlit", "run", "chef.py", "--server.port=8080", "--server.address=0.0.0.0"]
2. Set the following environment variables as instructed.
AR_REPO='gemini-repo'
SERVICE_NAME='gemini-streamlit-app'
3. Create the Artifact Registry repository with the `gcloud artifacts repositories create` command.
gcloud artifacts repositories create "$AR_REPO" --location="$REGION" --repository-format=Docker
4. Submit the build with the `gcloud builds submit` command.
gcloud builds submit --tag "$REGION-docker.pkg.dev/$PROJECT/$AR_REPO/$SERVICE_NAME"
This way, the Docker image is built and pushed to the Artifact Registry repository.
Task 5. Deploy the application to Cloud Run and test
Deploy the application (as a Docker artifact) to Cloud Run using the `gcloud run deploy` command.
gcloud run deploy "$SERVICE_NAME" \
--port=8080 \
--image="$REGION-docker.pkg.dev/$PROJECT/$AR_REPO/$SERVICE_NAME" \
--allow-unauthenticated \
--region=$REGION \
--platform=managed \
--project=$PROJECT \
--set-env-vars=PROJECT=$PROJECT,REGION=$REGION
Tada!
On successful deployment, you can visit the provided URL in the browser.
