
Posts

Showing posts tagged Google Cloud

Unique bucket names in Google Cloud Storage

One interesting thing about Google Cloud Storage buckets, which not everyone may be aware of, is that bucket names are unique across all of Google Cloud Storage (not just at project level, as most of us would assume). That means a bucket name must be globally unique, and that is a problem because some of the names we choose may already be taken. Google Cloud's recommendation is to name buckets as subdomains of a domain we own. Google verifies domain ownership before creating the bucket, so nobody else will be able to create buckets under our domain and, therefore, our bucket names become unique at Google Cloud level. For multi-site projects, it is also a good idea to add the bucket location to the name, so equivalent buckets can coexist in different locations. Examples of the above would be: mybucket.mydomain.net for a global "mybucket" bucket associated with our domain "mydomain.net" mybucke...
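The naming convention described above can be sketched with a small helper. This is an illustrative function, not part of any Google API, and the exact place where the location goes in the name is an assumption of this sketch:

```javascript
// Hypothetical helper for the domain-based naming convention described above.
// Putting the location between the bucket name and the domain is an assumed
// convention for this sketch, not a Google Cloud requirement.
function bucketName(name, domain, location) {
  // Buckets named as subdomains of a verified domain are globally unique.
  const parts = location ? [name, location, domain] : [name, domain];
  return parts.join('.');
}

console.log(bucketName('mybucket', 'mydomain.net'));       // mybucket.mydomain.net
console.log(bucketName('mybucket', 'mydomain.net', 'eu')); // mybucket.eu.mydomain.net
```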

Defining Google Cloud IAM conditions for Secret Manager roles

Defining conditions for the permissions granted to a Google Cloud service account helps enforce our security policy. By defining conditions we can, for instance, specify not just that a given account can access secrets, but also which secrets it can access. This is really important because, if an attacker took control of a compute resource associated with an account that has read access to secrets, they would literally be able to read all our secrets. However, if a condition is applied to that permission, only the secrets matching the condition would be exposed. To define an IAM permission condition, open the Google Cloud IAM administration console and edit the principal (service account) whose permissions must be conditioned. Then, by clicking the "ADD CONDITION" label of the role whose permissions must be conditioned, we reach the condition definition view, which contains both the Condition Builder that allows us defining c...
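Conditions in the Condition Builder are expressed in CEL (Common Expression Language). As a hedged sketch, a condition restricting a Secret Manager role to secrets whose resource names start with a given prefix could look like the following (the project number and the "myapp-" prefix are hypothetical):

```
resource.name.startsWith("projects/123456789/secrets/myapp-")
```

With this condition attached to, say, the Secret Manager Secret Accessor role, only secrets whose names begin with that prefix would be readable by the account.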

Debugging Google Cloud Functions with event signature in Node

In the article Debugging Google Cloud Functions in Node we gave a first overview of what the Google Functions Framework is and how it can be used. That article focused mainly on debugging Google Cloud Functions with HTTP signature, i.e., functions intended to be triggered by an HTTP request. There are, however, other ways of triggering the execution of a Google Cloud Function, for instance from Cloud Storage , Firestore or, more commonly, by forwarding a Pub/Sub message . This article covers how to debug Google Cloud Functions intended to be triggered by a Pub/Sub message. Google Cloud Functions with cloud event signature Let's assume we have a function that must be triggered by a Pub/Sub message. To do that, we need to perform two actions: Properly configuring Google Cloud to specify that our function will be triggered by a given kind of Pub/Sub message. This part is described in the official documentation and is out of the scope of this article...
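As a minimal sketch of what such a function body looks like (names are illustrative; with the Functions Framework this handler would be registered via its cloud event API), the Pub/Sub payload arrives base64-encoded inside the event:

```javascript
// Illustrative Pub/Sub-triggered handler body. With a cloud event signature,
// the Pub/Sub payload arrives base64-encoded in cloudEvent.data.message.data.
function handlePubSubEvent(cloudEvent) {
  const encoded = (cloudEvent.data.message && cloudEvent.data.message.data) || '';
  const text = Buffer.from(encoded, 'base64').toString('utf8');
  console.log(`Received message: ${text}`);
  return text;
}

// Simulating the event locally, as a debugging session would receive it:
const fakeEvent = {
  data: { message: { data: Buffer.from('hello').toString('base64') } },
};
handlePubSubEvent(fakeEvent); // logs "Received message: hello"
```

Being able to build such a fake event by hand is precisely what makes local debugging of event-triggered functions practical.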

Understanding Google Cloud Tasks Timing

Google Cloud Tasks offers a good solution for controlling workflows in our backend infrastructure, increasing resilience to load peaks while keeping costs under control. Such a powerful solution requires careful configuration to work properly, especially when dealing with the task retry policy. This feature is basically controlled by a set of properties described in the Google Cloud documentation and quickly described here: minBackoff : Elapsed time in seconds, from the initial task execution, that defines the lower limit of the incremental retry time window. The first retry will occur minBackoff seconds after the initial task execution. maxBackoff : Elapsed time in seconds, from the initial task execution, that defines the upper limit of the incremental retry time window. After reaching maxBackoff the retry interval will not increase, and retries will occur every maxBackoff seconds. maxDoublings : Number of times that minBackoff will be doubled (multiplied by two) in order to ...
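A simplified model of the schedule these properties produce, following the description above, can be sketched as follows. This is an illustration of the doubling-then-capping behaviour only; consult the official RetryConfig documentation for the exact semantics:

```javascript
// Simplified model of the retry schedule described above: the backoff starts
// at minBackoff seconds, doubles at most maxDoublings times, and is capped at
// maxBackoff. Illustrative sketch, not the exact Cloud Tasks algorithm.
function retryDelays(minBackoff, maxBackoff, maxDoublings, retries) {
  const delays = [];
  for (let i = 0; i < retries; i++) {
    const doubled = minBackoff * 2 ** Math.min(i, maxDoublings);
    delays.push(Math.min(doubled, maxBackoff));
  }
  return delays;
}

console.log(retryDelays(10, 300, 3, 6)); // [ 10, 20, 40, 80, 80, 80 ]
console.log(retryDelays(10, 60, 5, 5));  // [ 10, 20, 40, 60, 60 ]
```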

GCloud: Changing the default project (in a nutshell)

The gcloud command line tool allows us to manage a list of Google Cloud projects, one of which is considered the default project, so all the operations we perform via gcloud are performed on it. Available projects To check which projects are available via gcloud , run the command: $ gcloud projects list This command produces an output similar to this example: PROJECT_ID NAME PROJECT_NUMBER fooOrg-pr01 Project Bar 239785793387 fooOrg-pr02 Project Zaz 348328582382 ... You can check the available options in the GCloud SDK reference . Current default project As you can see, that output doesn't indicate which one is the default project. To get it, run this command: $ gcloud config get-value project The command output will show the ID of the default project. Following the above example, the output of the command could be fooOrg-pr01 , stating that the default project is our Project Bar. Changing the current default project To change the default project, run this comman...

Google Cloud Storage: List files with wildcards

One of the most frustrating things when using Google Cloud Storage is not being able to search files using wildcards: the Google Cloud web interface only allows stating a file prefix, which in many cases is not enough. Luckily, the Google Cloud SDK gsutil ls command, also available through the Google Cloud Console, allows listing storage bucket contents using wildcards. Let's see some examples: Listing all the available buckets gsutil ls Listing a specific bucket content gsutil ls gs://my_bucket Listing, in a specific bucket, all the files matching a classic wildcard pattern As usual, the * symbol matches zero or more characters, and the ? symbol matches exactly one character. gsutil ls gs://my_bucket/foo*.json gsutil ls gs://my_bucket/*bar.json gsutil ls gs://my_bucket/foo*bar.json gsutil ls gs://my_bucket/foo*.jso? Listing, in a specific bucket, content matching an advanced wildcard pattern A set of characters inside brackets matches all the filenames that have one of those characters. For i...

Debugging Google Cloud Functions in Node

DISCLAIMER: This article focuses on Google Cloud Functions with HTTP signature, i.e., functions intended to be triggered by an HTTP request. If you are interested in Cloud Functions triggered by Cloud Storage, File Storage or Pub/Sub events, read Debugging Google Cloud Functions with event signature in Node . Cloud Functions, the Function as a Service (FaaS) product of Google Cloud, is an attractive resource for those developing solutions in cloud environments. Similar in conception to AWS Lambdas or Azure Functions, they offer serverless processing at an affordable price. From the developer's point of view, they don't differ much from a standard function contained in a program developed in Node, Python, Go or Java, except for debugging: due to their ubiquitous nature, debugging is not easy and most of the time it is done following one of these two approaches: Uploading a debug version, running it and then checking its output (logging and/or funct...
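To fix ideas, a minimal HTTP-signature function looks like the sketch below (the helloWorld name and the req.body.name field are illustrative). With the Functions Framework such a function can be run and stepped through locally instead of being deployed first:

```javascript
// Minimal HTTP-signature function (illustrative name and payload field).
// Locally it can be exercised, and debugged, with mocked req/res objects.
function helloWorld(req, res) {
  const name = (req.body && req.body.name) || 'world';
  res.send(`Hello ${name}!`);
}

// Local sanity check with a mocked request/response pair:
const fakeRes = { send(text) { this.sent = text; } };
helloWorld({ body: { name: 'GCF' } }, fakeRes);
console.log(fakeRes.sent); // Hello GCF!
```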

gcloud: Run cloud functions from command line

Tired of that slow web interface that Google Cloud Console offers to those lazy programmers out there? Then gcloud comes to your help. Among a myriad of other features, gcloud allows calling GC functions from the command line of your computer. For instance, this command calls the helloWorld function, deployed on the cloud region "europe-west1": gcloud functions call helloWorld --region="europe-west1" Calling  gcloud functions list  is the way to get all the available GC functions, together with their state, trigger and cloud region. Functions input data Most of the time your functions will require some input data to work. Depending on the function triggering method, data can arrive in different ways (in the body of an HTTP request, in a Pub/Sub message...), but the way of sending that data from the gcloud command is always the same: gcloud functions call helloWorld \ --region "europe-west1...