Introduction to Deploying Node.js on Google Cloud Platform
Deploying applications efficiently and reliably is a critical part of modern software development, and cloud services such as Google Cloud Platform (GCP) can significantly streamline the process, especially when combined with continuous integration and continuous deployment (CI/CD) practices. In this guide, we’ll show you how to set up Node.js CI/CD on GCP, using Bitbucket Pipelines for CI/CD and Docker for containerization.
Prerequisites for Node.js Deployment on GCP
Before we dive into the Node.js CI/CD deployment steps, ensure you have the following prerequisites:
- GCP Account: You need a Google Cloud Platform account. If you don’t have one, sign up at GCP.
- Bitbucket Account: Create a Bitbucket account if you don’t have one at Bitbucket.
- Node.js Application: Have a Node.js application ready for deployment.
- Docker: Install Docker on your local machine. Follow the instructions on the Docker website.
- Google Cloud SDK: Install the Google Cloud SDK by following the instructions at Google Cloud SDK.
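Before going further, it can help to confirm the tooling is actually installed. A quick, non-destructive shell check — it only looks up the tool names on your PATH and reports what is missing:

```shell
#!/bin/sh
# Check that each prerequisite tool is on PATH; report rather than abort.
MISSING=0
for tool in node npm docker gcloud; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "found:   $tool"
  else
    echo "missing: $tool"
    MISSING=$((MISSING + 1))
  fi
done
echo "prerequisite check complete ($MISSING tool(s) missing)"
```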
Creating the bitbucket-pipelines.yml File for Node.js Deployment
The `bitbucket-pipelines.yml` file defines the CI/CD pipeline in Bitbucket. Here’s a sample configuration for building and deploying a Docker image of your Node.js application:
```yaml
image: google/cloud-sdk:latest

pipelines:
  default:
    - step:
        name: Build and deploy to Cloud Run
        script:
          - echo "Starting CI/CD pipeline..."
          - echo $GOOGLE_APPLICATION_CREDENTIALS | base64 --decode > $BITBUCKET_CLONE_DIR/credentials.json
          - gcloud auth activate-service-account --key-file=$BITBUCKET_CLONE_DIR/credentials.json
          - gcloud config set project $GCR_PROJECT_ID
          - gcloud builds submit --tag gcr.io/$GCR_PROJECT_ID/my-nodejs-app
          - gcloud run deploy $CLOUD_RUN_SERVICE_NAME --image gcr.io/$GCR_PROJECT_ID/my-nodejs-app --region $CLOUD_RUN_REGION --platform managed --allow-unauthenticated
```
This configuration uses the Google Cloud SDK image to authenticate with your service account, build the Docker image of your Node.js application with Cloud Build, push it to Google Container Registry, and deploy it to Cloud Run.
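The deploy step is driven entirely by repository variables. Here is a dry-run sketch of how the image tag and the two key commands are composed — the values below are placeholders, and the commands are only printed, never executed:

```shell
#!/bin/sh
# Dry-run: compose the image tag and deploy commands from the same variables
# Bitbucket would inject into the step. All values below are placeholders.
GCR_PROJECT_ID=my-gcp-project
CLOUD_RUN_SERVICE_NAME=my-nodejs-app
CLOUD_RUN_REGION=us-central1

IMAGE="gcr.io/$GCR_PROJECT_ID/my-nodejs-app"
echo "gcloud builds submit --tag $IMAGE"
echo "gcloud run deploy $CLOUD_RUN_SERVICE_NAME --image $IMAGE --region $CLOUD_RUN_REGION --platform managed --allow-unauthenticated"
```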
Adding Environment Variables to Bitbucket for GCP Deployment
Next, you need to add environment variables to your Bitbucket repository settings. These variables will store sensitive information and configuration for the Node.js CI/CD deployment:
- Navigate to your Bitbucket repository.
- Go to Repository settings > Pipelines > Environment variables.
- Add the following environment variables:
- `CLOUD_RUN_REGION`: Your Cloud Run region (e.g., `us-central1`).
- `GCR_PROJECT_ID`: Your Google Cloud project ID.
- `GOOGLE_APPLICATION_CREDENTIALS`: Your service account credentials (base64 encoded).
- `CLOUD_RUN_SERVICE_NAME`: Your desired Cloud Run service name.
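A pipeline that is missing one of these variables tends to fail late with a cryptic gcloud error, so it can be worth asserting them at the top of the step. A minimal sketch — the placeholder values here stand in for what Bitbucket would inject:

```shell
#!/bin/sh
# Fail fast if any required repository variable is unset.
# Placeholder values are assigned so the sketch runs standalone;
# in a real pipeline step Bitbucket injects them instead.
CLOUD_RUN_REGION=us-central1
GCR_PROJECT_ID=my-gcp-project
GOOGLE_APPLICATION_CREDENTIALS=base64-encoded-key
CLOUD_RUN_SERVICE_NAME=my-nodejs-app

MISSING=""
for var in CLOUD_RUN_REGION GCR_PROJECT_ID GOOGLE_APPLICATION_CREDENTIALS CLOUD_RUN_SERVICE_NAME; do
  eval "value=\$$var"
  [ -n "$value" ] || MISSING="$MISSING $var"
done
if [ -n "$MISSING" ]; then
  echo "Missing variables:$MISSING" >&2
  exit 1
fi
echo "all required variables are set"
```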
Creating Service Accounts and Generating Keys in GCP
Service accounts are crucial for securely automating interactions with Google Cloud services. For this setup we need two: one for the Bitbucket webhook and one for the Cloud Run service. Here’s how to create them and configure their roles:
- Go to the GCP Console:
- Open the GCP Console.
- Navigate to Service Accounts:
- Go to IAM & Admin > Service accounts.
- Create Service Account for the Bitbucket Webhook:
  - Click Create Service Account.
  - Provide a name and description, e.g., `bitbucket-webhook-service-account`.
  - Click Create.
  - Assign the following roles:
    - Storage Admin
    - Cloud Run Admin
    - Artifact Registry Writer
  - Click Done.
- Create Service Account for the Cloud Run Service:
  - Click Create Service Account.
  - Provide a name and description, e.g., `cloud-run-service-account`.
  - Click Create.
  - Assign the following roles:
    - Cloud Run Admin
    - Cloud SQL Client
  - Click Done.
- Generate Keys:
- For each service account, select the account from the list.
- Click Keys > Add Key > Create New Key.
- Choose JSON and download the key file.
- Encode the Key File to Base64:
  - Use the following command to encode the key file to base64:

```shell
base64 <key-file.json>
```
- Add Encoded String to Environment Variables in Bitbucket:
  - Copy the base64 encoded string.
  - Add it to the `GOOGLE_APPLICATION_CREDENTIALS` environment variable in Bitbucket.
Follow these steps to create the necessary service accounts with the appropriate roles, and generate keys to securely automate the Node.js CI/CD deployment pipeline.
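Keys can also be created from the CLI with `gcloud iam service-accounts keys create`. Whichever way you generate the key, it is worth verifying the base64 round trip locally before pasting the string into Bitbucket. A sketch using a dummy key file (not a real credential):

```shell
#!/bin/sh
# Round-trip check: encode a dummy key file the way you would before pasting
# it into Bitbucket, then decode it the way the pipeline does.
printf '{"type":"service_account","project_id":"my-gcp-project"}' > key-file.json
ENCODED=$(base64 key-file.json | tr -d '\n')   # one line, safe to store in a variable
echo "$ENCODED" | base64 --decode > decoded.json
cmp key-file.json decoded.json && echo "round-trip OK"
```

If `cmp` reports a difference, the string was mangled (often by line wrapping) and authentication in the pipeline will fail.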
Building a Docker Image for Node.js in Bitbucket
Using a multi-stage Dockerfile can help optimize your Docker images by reducing their size and improving build efficiency. This approach is particularly useful for Node.js applications. Here’s how to create a multi-stage Dockerfile for your Node.js project:
Multi-Stage Dockerfile
Create a `Dockerfile` in your Node.js project directory:
```dockerfile
# Stage 1: Build
FROM node:18 AS build

# Create and change to the app directory
WORKDIR /usr/src/app

# Copy application dependency manifests to the container image.
COPY package*.json ./

# Install all dependencies (devDependencies are needed for the build step)
RUN npm install

# Copy local code to the container image.
COPY . .

# Build the application (if needed)
# RUN npm run build

# Drop devDependencies so only production modules are carried forward
RUN npm prune --omit=dev

# Stage 2: Run
FROM node:18-slim

# Create and change to the app directory
WORKDIR /usr/src/app

# Copy the application code and pruned node_modules from the build stage
COPY --from=build /usr/src/app .

# Run the web service on container startup.
CMD [ "node", "index.js" ]
```
This `Dockerfile` uses a build stage to install dependencies and build your application, then produces a final image containing only the necessary files and production dependencies.
For more details on creating multi-stage Dockerfiles, refer to this article that provides an in-depth guide.
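Before wiring the image into the pipeline, you can build it locally as a smoke test. A hedged sketch — `my-nodejs-app` is a placeholder tag, and the build is skipped when Docker or a `Dockerfile` is not present:

```shell
#!/bin/sh
# Local smoke test: build the image before the pipeline ever runs.
# "my-nodejs-app" is a placeholder tag; the build is skipped when Docker
# or a Dockerfile is not available in the current directory.
TAG=my-nodejs-app
if command -v docker >/dev/null 2>&1 && [ -f Dockerfile ]; then
  docker build -t "$TAG" .
  echo "built $TAG - run it with: docker run --rm -p 8080:8080 $TAG"
else
  echo "skipping local build (docker or Dockerfile not found)"
fi
```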
Creating an SQL Service Instance and Database in GCP
For applications that require a database, you can use Google Cloud SQL. Here’s how to create an SQL service instance and database:
- Open the GCP Console.
- Navigate to SQL.
- Click Create instance.
- Choose the desired database engine (e.g., MySQL).
- Configure the instance (ID, password, region, etc.) and click Create.
- Once the instance is created, click on it to manage it.
- Go to Databases and click Create database. Enter a name for your database.
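The same instance and database can be created from the CLI. A sketch using hypothetical names (`my-nodejs-sql`, `appdb`); with `DRY_RUN=1`, the default here, the commands are only printed rather than executed:

```shell
#!/bin/sh
# CLI alternative to the Console steps above. Instance/database names are
# hypothetical; with DRY_RUN=1 (the default here) commands are printed, not run.
INSTANCE=my-nodejs-sql
DATABASE=appdb
REGION=us-central1

run() { if [ "${DRY_RUN:-1}" = "1" ]; then echo "+ $*"; else "$@"; fi; }

run gcloud sql instances create "$INSTANCE" --database-version=MYSQL_8_0 --tier=db-f1-micro --region="$REGION"
run gcloud sql databases create "$DATABASE" --instance="$INSTANCE"
```

Set `DRY_RUN=0` only once you have reviewed the printed commands against your project.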
Deploying Cloud Run Service from Docker Image
First, build your Docker image and push it to Google Container Registry. Then, you can deploy it to Cloud Run using the Google Cloud Console. Follow these steps:
- Navigate to Cloud Run:
- Go to the Google Cloud Console.
- Select your project.
- Navigate to Cloud Run from the left-hand menu under Compute.
- Click on “Service” in the “DEPLOY CONTAINER” Dropdown:
- Click Service to start creating a new Cloud Run service.
- Configure the Service:
  - Service configuration:
    - Name: Enter a name for your service, e.g., `my-nodejs-service`.
    - Region: Choose the region where you want to deploy your service.
  - Container image:
    - Location: Select Google Container Registry (GCR).
    - Region: Choose the region where your Docker image is stored.
    - Image: Enter the path to your Docker image in GCR, e.g., `gcr.io/your-project-id/my-nodejs-app`.
- Advanced Settings:
  - Connections:
    - Add connection: Click Add connection to add a connection to a Cloud SQL instance.
    - Cloud SQL instance: Select your Cloud SQL instance.
    - Authentication: Choose the authentication method (e.g., service account).
    - Database name: Enter the name of your database.
    - Connection name: Use the socket path for connecting to the database.
  - Revision autoscaling:
    - Container concurrency (maximum requests per container): Set the maximum number of concurrent requests each container instance can process.
    - CPU allocation: Adjust the CPU allocation based on your application’s resource needs.
    - Memory allocation: Specify the amount of memory allocated to each container instance.
- Deploy: Click Deploy to deploy your service with the specified configuration.
- Access URL: Once deployed, you will receive an access URL for your Cloud Run service. Use this URL to access your deployed Node.js application.
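The Connection name field in the Cloud SQL step above refers to the unix socket that Cloud Run mounts at `/cloudsql/PROJECT:REGION:INSTANCE` for each connected instance. A small sketch of how that path is composed (all values are placeholders):

```shell
#!/bin/sh
# Compose the unix socket path Cloud Run exposes for a connected Cloud SQL
# instance: /cloudsql/PROJECT:REGION:INSTANCE. All values are placeholders.
PROJECT=my-gcp-project
REGION=us-central1
INSTANCE=my-nodejs-sql

CONNECTION_NAME="$PROJECT:$REGION:$INSTANCE"
SOCKET_PATH="/cloudsql/$CONNECTION_NAME"
echo "point your MySQL client at the socket: $SOCKET_PATH"
```

Your application then connects to MySQL through this socket path instead of a TCP host and port.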
Explanation of Revision Autoscaling
Cloud Run offers autoscaling capabilities to handle varying traffic loads efficiently. Here’s a brief explanation of the key settings:
- Container concurrency (maximum requests per container): The maximum number of concurrent requests a single container instance handles. When all running instances reach this limit, Cloud Run starts additional instances, and scales them back down as traffic drops.
- CPU allocation: Adjusts the amount of CPU resources allocated to each container instance based on your application’s CPU requirements.
- Memory allocation: Specifies the amount of memory allocated to each container instance to ensure optimal performance.
By configuring these settings, you can ensure that your Node.js application deployed on Cloud Run scales seamlessly to handle increased traffic while optimizing resource usage.
Conclusion: Successfully Deploying Node.js on GCP
In this guide, we covered the entire process of deploying a Node.js application on Google Cloud Platform using Bitbucket Pipelines for CI/CD and Docker for containerization. We started by setting up the environment and creating the necessary configuration files, then built and deployed Docker images, and finally connected the application to a MySQL database.
By following these steps, you can leverage the power of GCP and CI/CD to streamline your development and deployment processes and ensure reliable, scalable application delivery. If you have any questions or feedback, feel free to leave a comment below. Happy coding!