Google Cloud Console | Cloud Storage | Business Continuity Cloud | Arcserve


Google Cloud Console

Use the Google Cloud console to perform basic storage management tasks for Cloud Storage. Here are some common uses of the Google Cloud console:

  • Enable the Cloud Storage API for a project
  • Create and delete buckets
  • Upload, download, and delete objects
  • Manage Identity and Access Management (IAM) policies

This page introduces the Google Cloud console and some of the tasks you can accomplish with it to manage your data. For more advanced tasks, use the Google Cloud CLI or one of the client libraries that support Cloud Storage.

Try it

If you're new to Google Cloud, create an account to evaluate Cloud Storage in real-world conditions. New customers also get $300 in free credits to run, test, and deploy workloads.

Access the Google Cloud console

The Google Cloud console requires no setup or installation, and you can access it directly in a browser. Depending on your use case, you access the Google Cloud console in a slightly different way. If you are:

A user authorized to access a project

The current project owner can give you access to the entire project, which applies to all buckets and objects defined in the project.

A user authorized to access a bucket

Use https://console.cloud.google.com/storage/browser/BUCKET_NAME.

In this use case, the project owner gives you access to a single bucket within a larger project, and sends you the bucket name, which you substitute into the URL above. You can only work with objects in the specified bucket. This approach is practical for users who don't have access to the full project but need access to one bucket. When you open the URL, you are prompted to authenticate with a Google account if you are not already signed in.

A variant of this use case: a project owner grants read access to a bucket's objects to all users, creating a bucket whose contents are publicly readable. For more information, see the sections on defining object permissions and metadata below.

A user authorized to access an object

Use https://console.cloud.google.com/storage/browser/_details/BUCKET_NAME/OBJECT_NAME.

In this use case, the project owner gives you access to specific objects within a bucket and sends you the URL that grants access to those objects. When you open the URL, you are prompted to authenticate with a Google account if you are not already signed in.

Note that the format of the URL above differs from the URL of objects that are publicly shared. When you share a link publicly, the URL has this format: https://storage.googleapis.com/BUCKET_NAME/OBJECT_NAME. This public URL does not require the recipient to authenticate with Google and can be used for unauthenticated access to an object.
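As a sketch, the public URL format above can be assembled with a small helper (the bucket and object names here are hypothetical):

```python
from urllib.parse import quote

def public_object_url(bucket_name: str, object_name: str) -> str:
    """Build the unauthenticated public URL for a Cloud Storage object.

    Slashes in object names (simulated folders) stay as path separators;
    other special characters are percent-encoded.
    """
    return f"https://storage.googleapis.com/{bucket_name}/{quote(object_name)}"

print(public_object_url("my-bucket", "pets/cat.jpeg"))
# https://storage.googleapis.com/my-bucket/pets/cat.jpeg
```

Anyone with this URL can fetch the object, which is why it only works for objects that have been shared publicly.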

Tasks you can perform in the Google Cloud console

The Google Cloud console lets you perform basic data storage tasks in a browser. To use the Google Cloud console, you must authenticate with Google and have the appropriate permissions to perform a given task. If you are the account owner who created the project, you most likely already have all the permissions needed to perform the tasks below. Otherwise, you can be granted access to a project or be authorized to perform actions on a bucket.

Create a bucket

Cloud Storage uses a flat namespace to store your data. However, you can use the Google Cloud console to create folders and mimic a tree structure. Your data is not physically stored in a hierarchical structure, but it is presented that way in the Google Cloud console.

Because Cloud Storage has no notion of folders, folder suffixes and object name delimiters are only visible when you view your folders using the gcloud CLI or another command-line tool that works with Cloud Storage.

For a step-by-step guide to creating buckets, see the Create buckets page.

Import data into a bucket

You import data into a bucket by uploading one or more files, or a folder containing files. When you upload a folder, the Google Cloud console preserves the folder's hierarchical structure, including all the files and subfolders it contains. A progress window lets you track your uploads in the Google Cloud console. You can minimize this window and continue working with the bucket.

For a step-by-step guide to uploading objects into buckets using the Google Cloud console, see the Upload objects page.

You can also upload objects in the Google Cloud console by dragging and dropping files and folders from your desktop or file manager onto a bucket or subfolder in the Google Cloud console.

Note: Dragging and dropping folders onto the Google Cloud console pane is only supported in the Chrome browser. You can, however, drag and drop one or more files onto the Google Cloud console in any browser.

Download data from a bucket

For a step-by-step guide to downloading objects from buckets using the Google Cloud console, see the Download objects page.

You can also click an object to display detailed information about it. When the object can be displayed, the information page includes a preview of the object itself.

Note: To download several objects at once, use the gcloud CLI instead.

Create and use folders

Because Cloud Storage has no concept of folders, the folders you create in the Google Cloud console are a convenient way to organize objects in a bucket. As a visual aid, the Google Cloud console displays folders with a folder icon to distinguish them from objects.

Objects added to a folder appear to reside inside that folder in the Google Cloud console. In reality, all objects exist at the bucket level and simply include the directory structure in their names. For example, if you create a folder called pets and add a file cat.jpeg to it, the Google Cloud console displays the file as if it were inside the folder. In reality, there is no separate folder entity: the file simply exists in the bucket under the name pets/cat.jpeg.
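To illustrate, here is a minimal sketch (not the console's actual code) of how a flat namespace can be presented as folders by grouping object names on their "/" delimiter:

```python
def list_entries(object_names, prefix=""):
    """Simulate a folder view over a flat namespace.

    Objects whose names contain '/' beyond the prefix are grouped under a
    synthetic "folder" entry; everything else is listed as an object.
    """
    folders, objects = set(), []
    for name in object_names:
        if not name.startswith(prefix):
            continue
        rest = name[len(prefix):]
        if "/" in rest:
            # First path segment after the prefix becomes a folder entry.
            folders.add(prefix + rest.split("/", 1)[0] + "/")
        else:
            objects.append(name)
    return sorted(folders), objects

names = ["pets/cat.jpeg", "pets/dog.jpeg", "readme.txt"]
print(list_entries(names))          # (['pets/'], ['readme.txt'])
print(list_entries(names, "pets/")) # ([], ['pets/cat.jpeg', 'pets/dog.jpeg'])
```

The bucket itself only ever stores the three flat names; the "pets/" folder exists purely in the presentation layer.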

Unlike bucket names, folder names do not have to be unique. In other words, a bucket name can only be used if no other bucket has the same name, but folder names can be reused as long as those folders do not reside in the same bucket or subfolder.

When you browse folders in the Google Cloud console, you can navigate to higher directory levels by clicking the name of the desired folder or bucket in the breadcrumb trail above the file lists.

When you work with your buckets and data in other tools, the folder presentation may differ from what the Google Cloud console displays. To learn more about how different tools, such as the gcloud CLI, simulate folders in Cloud Storage, see the Folders page.

Filter the buckets or objects displayed

In the Google Cloud console, the list of buckets for a project lets you filter the buckets displayed using the Filter buckets text box.

  • You can always filter by bucket name prefix.
  • For projects with fewer than 1,000 buckets, you can also use additional filter criteria, such as the buckets' location.
  • For projects with more than 1,000 buckets, you must enable additional filter criteria using the drop-down menu next to the filtering text box. Note, however, that enabling additional filter criteria on projects with thousands of buckets degrades filtering performance.

In the Google Cloud console, the object list for a bucket lets you filter the objects displayed by entering a prefix in the Filter by object or folder name prefix text box above the object list. This filter displays objects whose names begin with the specified prefix. The prefix only filters the bucket's currently displayed objects: objects contained in folders are not matched.
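As a sketch, the filter's behavior — matching only the names of currently displayed entries, never objects nested inside folders — can be modeled like this (the entry names are hypothetical):

```python
def filter_displayed(entries, prefix):
    """Keep only displayed entries (objects and folder names at the current
    level) whose names start with the given prefix. Objects nested inside
    folders are never matched, because they are not in the displayed list."""
    return [e for e in entries if e.startswith(prefix)]

# Top-level view of a bucket: one folder and two objects.
print(filter_displayed(["folder1/", "notes.txt", "nodes.csv"], "no"))
# ['notes.txt', 'nodes.csv']
```

An object such as folder1/notes.txt would not appear here even though its name under the folder matches, since only the folder entry itself is displayed at this level.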

Define object metadata

You can configure an object's metadata in the Google Cloud console. Object metadata controls how requests are handled, including the content type and how the data is stored. The Google Cloud console only lets you set metadata on one object at a time. To set metadata on several objects simultaneously, use the gcloud storage objects update command.

For a step-by-step guide to viewing and editing an object's metadata, see the View and edit object metadata page.

Note: You cannot set metadata on a folder.

Delete objects, folders, and buckets

To delete a bucket, folder, or object in the Google Cloud console, select the corresponding checkbox, click the Delete button, and confirm that you want to proceed. When you delete a folder or bucket, you also delete all the objects it contains, including objects marked as public.

For a step-by-step guide to deleting objects from your buckets using the Google Cloud console, see the Delete objects page.

To learn how to delete buckets from a project using the Google Cloud console, see the Delete buckets page.

Share data publicly

Note: You cannot publicly share data stored in buckets that are protected by public access prevention.

When you share an object publicly, a link icon appears in the object's public access column. Clicking the link displays a public URL that grants access to the object.

Note: The public URL is different from the link you get by right-clicking an object. Both links grant access to the object, but the public URL works without requiring the user to sign in to a Google account. For more information, see the Request endpoints page.

To learn how to access a publicly shared object, see the Access public data page.

To stop sharing an object publicly:

You can stop sharing an object publicly by deleting the permission entries for the allUsers or allAuthenticatedUsers principals.

  • For buckets in which you publicly share only certain objects, edit each object's ACL.
  • For buckets in which you publicly share all objects, remove the IAM access granted to allUsers.

Use the public access column

Buckets and objects in the Google Cloud console include a public access column that indicates under what circumstances the resources are publicly shared.

Public access column at the bucket level

A bucket's public access column can have one of the following values: Public to internet, Not public, or Subject to object ACLs.

A bucket is considered Public to internet if it has an IAM role binding that meets the following criteria:

  • The role is granted to the allUsers or allAuthenticatedUsers principal.
  • The role contains at least one storage permission other than storage.buckets.create or storage.buckets.list.

If these conditions are not met, the bucket is either Not public or Subject to object ACLs:

  • Not public: No IAM role grants public access to the bucket's objects, and uniform bucket-level access is enabled on the bucket.
  • Subject to object ACLs: No IAM role grants public access to the bucket's objects. However, access control lists (ACLs) can grant public access to individual objects in the bucket. Check each object's permissions to determine whether they grant public access. To use IAM exclusively, enable uniform bucket-level access on the bucket.
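The bucket-level criteria above can be sketched as a small classifier. This is a simplified model, not the console's implementation; the (principal, permissions) binding format is an assumption for illustration:

```python
PUBLIC_PRINCIPALS = {"allUsers", "allAuthenticatedUsers"}
# The two bucket permissions that do not, by themselves, make a bucket public.
EXEMPT_PERMS = {"storage.buckets.create", "storage.buckets.list"}

def bucket_public_access(iam_bindings, uniform_access):
    """Classify a bucket's public access column value.

    iam_bindings: list of (principal, storage_permissions) pairs.
    uniform_access: True if uniform bucket-level access is enabled.
    """
    for principal, perms in iam_bindings:
        # Public if a public principal holds any permission beyond the
        # two exempt bucket permissions.
        if principal in PUBLIC_PRINCIPALS and set(perms) - EXEMPT_PERMS:
            return "Public to internet"
    return "Not public" if uniform_access else "Subject to object ACLs"

print(bucket_public_access([("allUsers", ["storage.objects.get"])], True))
# Public to internet
print(bucket_public_access([("user:a@example.com", ["storage.objects.get"])], False))
# Subject to object ACLs
```

Note how the last case is Subject to object ACLs rather than Not public: with uniform bucket-level access disabled, per-object ACLs could still make individual objects public.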

Public access column at the object level

An object is considered public if either of the following conditions is met:

  1. The object's access control list (ACL) includes an entry for allUsers or allAuthenticatedUsers.
  2. The bucket containing the object has an IAM role binding that meets the following criteria:
    • The role is granted to the allUsers or allAuthenticatedUsers principal.
    • The role has at least one of the following storage permissions: storage.objects.get, storage.objects.getIamPolicy, storage.objects.setIamPolicy, storage.objects.update.

If either condition is met, the object's public access column shows Public to internet.

If neither condition is met, the object's public access column shows Not public.
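The object-level rules can be sketched the same way (again a simplified model; the input shapes are assumptions for illustration):

```python
PUBLIC_PRINCIPALS = {"allUsers", "allAuthenticatedUsers"}
# Bucket-level permissions that make contained objects publicly accessible.
PUBLIC_OBJECT_PERMS = {
    "storage.objects.get",
    "storage.objects.getIamPolicy",
    "storage.objects.setIamPolicy",
    "storage.objects.update",
}

def object_public_access(object_acl_principals, bucket_iam_bindings):
    """Return the object's public access column value.

    object_acl_principals: principals named in the object's ACL entries.
    bucket_iam_bindings: list of (principal, permissions) pairs on the bucket.
    """
    # Condition 1: the object's own ACL names a public principal.
    if PUBLIC_PRINCIPALS & set(object_acl_principals):
        return "Public to internet"
    # Condition 2: a bucket-level role grants a public principal one of the
    # object permissions above.
    for principal, perms in bucket_iam_bindings:
        if principal in PUBLIC_PRINCIPALS and PUBLIC_OBJECT_PERMS & set(perms):
            return "Public to internet"
    return "Not public"

print(object_public_access(["allUsers"], []))
# Public to internet
print(object_public_access([], [("allAuthenticatedUsers", ["storage.objects.get"])]))
# Public to internet
```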

Define permissions on a bucket

You can control access to a Cloud Storage bucket using IAM permissions. For example, you can set permissions on a bucket that allow an entity, such as a user or a group, to view or create objects in the bucket. This is useful when adding a user at the project level is not appropriate. The entity specified in the IAM permission must authenticate by signing in to Google when accessing the bucket. Share the bucket URL with the user(s) in this format: https://console.cloud.google.com/storage/browser/BUCKET_NAME/.

Define permissions on an object

IAM permissions in the Google Cloud console let you control access to a bucket's objects easily and uniformly. To customize access to individual objects in a bucket, use signed URLs or access control lists (ACLs) instead.

For a step-by-step guide to viewing and editing IAM permissions, see the Use IAM permissions page.

To view or edit permissions on individual objects, see the Edit ACLs section.

Note: You cannot set permissions on a folder.

Assign roles to users at the project level

When you create a project, you are granted the IAM Owner role. For other entities, such as coworkers, to be able to use your project's buckets and objects, they must be granted their own roles.

Once you have been granted a role in a project, the project name appears in your project list. If you own an existing project, you can grant principals access to it. For a step-by-step guide to adding and removing project-level access, see the Manage access to projects, folders, and organizations page.

Note: Whenever possible, grant the minimum level of permissions that still gives team members the access they need. For example, if a team member only needs to read objects stored in a project, select the Storage Object Viewer role. Similarly, if they need full control over objects (but not buckets) in a project, select Storage Object Admin.

Use object versioning

You can enable object versioning to retain archived versions of an object in case of accidental deletion or replacement. Enabling this feature increases storage costs, however. You can reduce those costs by adding Object Lifecycle Management conditions when you enable object versioning. These conditions automatically delete or downgrade the storage class of older object versions according to the parameters you specify. The example lifecycle configuration for deleting objects provides a set of possible conditions for this use case.
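As a sketch of such a condition set, here is a lifecycle configuration in the Cloud Storage lifecycle JSON format that deletes archived versions once two newer versions exist; the exact thresholds are illustrative:

```python
import json

# Hypothetical lifecycle rule: delete noncurrent (archived) versions
# once an object has at least two newer versions.
lifecycle_config = {
    "rule": [
        {
            "action": {"type": "Delete"},
            "condition": {"isLive": False, "numNewerVersions": 2},
        }
    ]
}
print(json.dumps(lifecycle_config, indent=2))
```

A configuration like this caps the number of archived versions retained per object, keeping versioning costs bounded.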

Archived versions of an object are listed and managed on the object's Version history tab.

Analyze buckets with sensitive data protection

Sensitive Data Protection is a service that lets you identify and protect sensitive data in your buckets. Sensitive Data Protection can help you meet compliance requirements by finding and redacting information such as:

  • Credit card numbers
  • IP addresses
  • Other types of personally identifiable information

For the list of data types that Sensitive Data Protection detects, see the InfoType detector reference documentation.

You can start a Sensitive Data Protection scan of a bucket by clicking the bucket's three-dot menu and selecting Scan with Sensitive Data Protection. For guidance on scanning a bucket with Sensitive Data Protection, see the Inspect a Cloud Storage location page.


Unless otherwise stated, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.

Last updated on 2023/08/09 (UTC).

Arcserve Cloud Console

Avoid IT disasters wherever you are, across your applications and systems, on your sites and in your clouds.

Customer profile: Medium and large companies
Partner profile: Value-added reseller (VAR)

Meet all your RTOs, RPOs, and SLAs without the complexity of multiple tools and interfaces

The cloud is rapidly changing how companies back up data, but most continue to use non-x86 platforms such as Unix, HP-UX, and AIX to run traditional applications. With such multi-generational IT infrastructure, companies face an increased risk of data loss and extended service interruptions, driven by several factors: the complexity of managing primary, secondary, and tertiary data centers, operations, disaster recovery protocols, and colocation facilities.

Arcserve Business Continuity Cloud lets you escape the 21st-century IT labyrinth by combining powerful backup, disaster recovery, high availability, and email archiving technologies into a complete solution that eliminates service interruptions and data loss wherever you are, across your applications and systems, on your sites and in your clouds.

Gain productivity and recover up to 50% more time. Eliminate the gaps in your business continuity strategy with a single solution. Protect every byte from a single management console.

A single solution with flexible policies for your systems, storage types, and applications

Operating through a unified, cloud-based management interface, Arcserve Business Continuity Cloud gives you complete protection for your entire IT ecosystem:

  • Prevent service interruptions and data loss across complex, multi-generational IT infrastructure with the only integrated, cloud-native data protection solution;
  • Meet your SLAs and guarantee your RTOs and RPOs, whether measured in seconds or hours;
  • Automatically test and validate your recovery capabilities and deliver granular reports to the stakeholders involved in data protection;
  • Safely move large volumes of data to and from the cloud without straining bandwidth;
  • Immediately restore access to critical systems and applications after an outage or disaster, including ransomware attacks;
  • Scale easily and pay as you grow, without adding extra tools or management interfaces;
  • Support compliance and regulatory requirements by simplifying legal discovery and compliance audits;
  • Protect your IT operations with multi-cloud and cross-cloud data protection.

With this solution, Arcserve addresses the challenges of protecting IT systems: the time, skills, expense, and multiple tools needed to protect new workloads.

— Anna Ribeiro, Computer Technology Review

How it works

Arcserve Business Continuity Cloud makes a full range of technologies available to meet current and future IT needs. Migrating your workloads to a public or private cloud? We've got you covered. Need advanced hypervisor protection? It's there. Need RTOs and RPOs of under a minute? We can do that too.

Unlike point tools that can't protect all your systems and applications, Arcserve offers technologies that cover all three generations of IT platforms, from Unix and x86 to public and private clouds.

These powerful technologies are accessible through a single online management console, giving you a seamless user experience for protecting your business. No need to switch between screens or juggle several products with different SLAs.

Reduce your total cost of ownership (TCO) and save up to 50% more time. From a single location, you manage the entire data protection process. Everything you need, we have it.

The Arcserve advantage

Ease of use and optimal design

Arcserve Business Continuity Cloud offers one of the most complete sets of technologies available, with a simplified user experience. That's why you can manage all of its robust features in a few clicks, within a single management interface, from anywhere in the world.

Backed by highly skilled product support

Our teams are among the most experienced in the data protection industry, and we put that expertise to work helping our customers and partners.

Advantageous for channel partners

Channel partners can adopt and manage Arcserve Business Continuity Cloud technologies to help customers who have limited resources but require a higher level of protection for crucial systems.

There is a trend toward unification in the industry. Arcserve does a good job here thanks to its integration capabilities and its ability to bring everything together, from on-premises to SaaS protection, all within a modern, easy-to-use user interface.
