Buckets#
Create / interact with gcloud storage buckets.
- class gcloud.storage.bucket.Bucket(connection=None, name=None, properties=None)[source]#
Bases: gcloud.storage._helpers._PropertyMixin
A class representing a Bucket on Cloud Storage.
Parameters: - connection (gcloud.storage.connection.Connection) – The connection to use when sending requests.
- name (string) – The name of the bucket.
- CUSTOM_PROPERTY_ACCESSORS = {'location': 'location', 'versioning': 'versioning_enabled', 'acl': 'acl', 'defaultObjectAcl': 'get_default_object_acl()', 'etag': 'etag', 'id': 'id', 'timeCreated': 'time_created', 'cors': 'get_cors()', 'metageneration': 'metageneration', 'lifecycle': 'get_lifecycle()', 'storageClass': 'storage_class', 'name': 'name', 'logging': 'get_logging()', 'projectNumber': 'project_number', 'selfLink': 'self_link', 'owner': 'owner'}#
Map field name -> accessor for fields w/ custom accessors.
- configure_website(main_page_suffix=None, not_found_page=None)[source]#
Configure website-related properties.
See: https://developers.google.com/storage/docs/website-configuration
Note
This only works if your bucket name is a domain name (which requires verifying ownership of that domain with Google).
If you want this bucket to host a website, just provide the name of an index page and a page to use when a key isn’t found:
>>> from gcloud import storage
>>> connection = storage.get_connection(project, email, private_key_path)
>>> bucket = connection.get_bucket(bucket_name)
>>> bucket.configure_website('index.html', '404.html')
You probably should also make the whole bucket public:
>>> bucket.make_public(recursive=True, future=True)
This says: “Make the bucket public, and all the stuff already in the bucket, and anything else I add to the bucket. Just make it all public.”
Parameters: - main_page_suffix (string) – The page to use as the main page of a directory. Typically something like index.html.
- not_found_page (string) – The file to use when a page isn’t found.
- connection[source]#
Getter property for the connection to use with this Bucket.
Return type: gcloud.storage.connection.Connection
Returns: The connection to use.
- copy_key(key, destination_bucket, new_name=None)[source]#
Copy the given key to the given bucket, optionally with a new name.
Parameters: - key (string or gcloud.storage.key.Key) – The key to be copied.
- destination_bucket (gcloud.storage.bucket.Bucket) – The bucket into which the key should be copied.
- new_name (string) – (optional) the new name for the copied file.
Return type: gcloud.storage.key.Key
Returns: The new Key.
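For example, a brief sketch (bucket names and credentials are placeholders, and the destination bucket is assumed to already exist):
>>> from gcloud import storage
>>> connection = storage.get_connection(project, email, key_path)
>>> bucket = connection.get_bucket('my-bucket')
>>> destination = connection.get_bucket('other-bucket')
>>> key = bucket.get_key('my-file.txt')
>>> new_key = bucket.copy_key(key, destination, new_name='copy-of-my-file.txt')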
- delete(force=False)[source]#
Delete this bucket.
The bucket must be empty in order to delete it. If the bucket doesn’t exist, this will raise gcloud.storage.exceptions.NotFound. If the bucket is not empty, this will raise gcloud.storage.exceptions.Conflict.
If you want to delete a non-empty bucket, pass force=True. This will iterate through the bucket’s keys and delete the related objects before deleting the bucket itself.
Parameters: force (boolean) – If True, empties the bucket’s objects, then deletes it.
Raises: gcloud.storage.exceptions.NotFound if the bucket does not exist, or gcloud.storage.exceptions.Conflict if the bucket has keys and force is not True.
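For example, a sketch of force-deleting a non-empty bucket (the bucket name and credentials are placeholders):
>>> from gcloud import storage
>>> connection = storage.get_connection(project, email, key_path)
>>> bucket = connection.get_bucket('my-bucket')
>>> bucket.delete(force=True)  # deletes every key first, then the bucket itself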
- delete_key(key)[source]#
Deletes a key from the current bucket.
If the key isn’t found, this will throw a gcloud.storage.exceptions.NotFound.
For example:
>>> from gcloud import storage
>>> from gcloud.storage import exceptions
>>> connection = storage.get_connection(project, email, key_path)
>>> bucket = connection.get_bucket('my-bucket')
>>> print bucket.get_all_keys()
[<Key: my-bucket, my-file.txt>]
>>> bucket.delete_key('my-file.txt')
>>> try:
...   bucket.delete_key('doesnt-exist')
... except exceptions.NotFound:
...   pass
Parameters: key (string or gcloud.storage.key.Key) – A key name or Key object to delete.
Return type: gcloud.storage.key.Key
Returns: The key that was just deleted.
Raises: gcloud.storage.exceptions.NotFound. To suppress the exception, call delete_keys instead, passing a no-op on_error callback, e.g.:
>>> bucket.delete_keys([key], on_error=lambda key: None)
- delete_keys(keys, on_error=None)[source]#
Deletes a list of keys from the current bucket.
Uses Bucket.delete_key() to delete each individual key.
Parameters: - keys (list of string or gcloud.storage.key.Key) – A list of key names or Key objects to delete.
- on_error (a callable taking (key)) – If not None, called once for each key raising gcloud.storage.exceptions.NotFound; otherwise, the exception is propagated.
Raises: gcloud.storage.exceptions.NotFound (if on_error is not passed).
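For example, a sketch that deletes several keys and silently skips any that are already gone (key names are placeholders):
>>> from gcloud import storage
>>> connection = storage.get_connection(project, email, key_path)
>>> bucket = connection.get_bucket('my-bucket')
>>> # A no-op on_error callback suppresses NotFound for missing keys.
>>> bucket.delete_keys(['file-1.txt', 'file-2.txt'], on_error=lambda key: None)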
- disable_logging()[source]#
Disable access logging for this bucket.
See: https://cloud.google.com/storage/docs/accesslogs#disabling
- disable_website()[source]#
Disable the website configuration for this bucket.
This is really just a shortcut for setting the website-related attributes to None.
- enable_logging(bucket_name, object_prefix='')[source]#
Enable access logging for this bucket.
See: https://cloud.google.com/storage/docs/accesslogs#delivery
Parameters: - bucket_name (string) – name of bucket in which to store access logs
- object_prefix (string) – prefix for access log filenames
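For example, a sketch assuming a separate bucket named my-logs-bucket has already been created to receive the access logs:
>>> from gcloud import storage
>>> connection = storage.get_connection(project, email, key_path)
>>> bucket = connection.get_bucket('my-bucket')
>>> bucket.enable_logging('my-logs-bucket', object_prefix='my-bucket-access')
>>> bucket.disable_logging()  # turn access logging back off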
- etag[source]#
Retrieve the ETag for the bucket.
See: http://tools.ietf.org/html/rfc2616#section-3.11 and https://cloud.google.com/storage/docs/json_api/v1/buckets
Return type: string
- classmethod from_dict(bucket_dict, connection=None)[source]#
Construct a new bucket from a dictionary of data from Cloud Storage.
Parameters: bucket_dict (dict) – The dictionary of data to construct a bucket from.
Return type: Bucket
Returns: A bucket constructed from the data provided.
- get_all_keys()[source]#
List all the keys in this bucket.
This will not retrieve the data for all the keys; it only retrieves the keys themselves.
This is equivalent to:
keys = [key for key in bucket]
Return type: list of gcloud.storage.key.Key
Returns: A list of all the Key objects in this bucket.
- get_cors()[source]#
Retrieve CORS policies configured for this bucket.
Return type: list(dict)
Returns: A sequence of mappings describing each CORS policy.
- get_default_object_acl()[source]#
Get the current Default Object ACL rules.
If the acl isn’t available locally, this method will reload it from Cloud Storage.
Return type: gcloud.storage.acl.DefaultObjectACL
Returns: A DefaultObjectACL object for this bucket.
- get_key(key)[source]#
Get a key object by name.
This will return None if the key doesn’t exist:
>>> from gcloud import storage
>>> connection = storage.get_connection(project, email, key_path)
>>> bucket = connection.get_bucket('my-bucket')
>>> print bucket.get_key('/path/to/key.txt')
<Key: my-bucket, /path/to/key.txt>
>>> print bucket.get_key('/does-not-exist.txt')
None
Parameters: key (string or gcloud.storage.key.Key) – The name of the key to retrieve.
Return type: gcloud.storage.key.Key or None
Returns: The key object if it exists, otherwise None.
- get_lifecycle()[source]#
Retrieve lifecycle rules configured for this bucket.
See: https://cloud.google.com/storage/docs/lifecycle and https://cloud.google.com/storage/docs/json_api/v1/buckets
Return type: list(dict)
Returns: A sequence of mappings describing each lifecycle rule.
- get_logging()[source]#
Return info about access logging for this bucket.
See: https://cloud.google.com/storage/docs/accesslogs#status
Return type: dict or None
Returns: a dict with keys logBucket and logObjectPrefix (if logging is enabled), or None (if not).
- id[source]#
Retrieve the ID for the bucket.
See: https://cloud.google.com/storage/docs/json_api/v1/buckets
Return type: string
- iterator(prefix=None, delimiter=None, max_results=None, versions=None)[source]#
Return an iterator used to find keys in the bucket.
Parameters: - prefix (string or None) – optional prefix used to filter keys.
- delimiter (string or None) – optional delimiter, used with prefix to emulate hierarchy.
- max_results (integer or None) – maximum number of keys to return.
- versions (boolean or None) – whether object versions should be returned as separate keys.
Return type: _KeyIterator
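For example, a sketch that lists only the keys under a logs/ prefix, using / as the delimiter to emulate directories (names are placeholders):
>>> from gcloud import storage
>>> connection = storage.get_connection(project, email, key_path)
>>> bucket = connection.get_bucket('my-bucket')
>>> for key in bucket.iterator(prefix='logs/', delimiter='/'):
...   print key.name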
- location#
Retrieve location configured for this bucket.
See: https://cloud.google.com/storage/docs/json_api/v1/buckets and https://cloud.google.com/storage/docs/concepts-techniques#specifyinglocations
Return type: string
- make_public(recursive=False, future=False)[source]#
Make a bucket public.
Parameters: - recursive (bool) – If True, this will make all keys inside the bucket public as well.
- future (bool) – If True, this will make all objects created in the future public as well.
- metageneration[source]#
Retrieve the metageneration for the bucket.
See: https://cloud.google.com/storage/docs/json_api/v1/buckets
Return type: integer
- new_key(key)[source]#
Given a path name (or an existing Key), return a storage.key.Key object.
This is really useful when you’re not sure if you have a Key object or a string path name. Given either of those types, this returns the corresponding Key object.
Parameters: key (string or gcloud.storage.key.Key) – A path name or actual key object.
Return type: gcloud.storage.key.Key
Returns: A Key object with the path provided.
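For example, a short sketch (the path is a placeholder):
>>> from gcloud import storage
>>> connection = storage.get_connection(project, email, key_path)
>>> bucket = connection.get_bucket('my-bucket')
>>> key = bucket.new_key('path/to/my-file.txt')
>>> print key
<Key: my-bucket, path/to/my-file.txt>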
- owner[source]#
Retrieve info about the owner of the bucket.
See: https://cloud.google.com/storage/docs/json_api/v1/buckets
Return type: dict
Returns: mapping of owner’s role/ID.
- project_number[source]#
Retrieve the number of the project to which the bucket is assigned.
See: https://cloud.google.com/storage/docs/json_api/v1/buckets
Return type: integer
- self_link[source]#
Retrieve the URI for the bucket.
See: https://cloud.google.com/storage/docs/json_api/v1/buckets
Return type: string
- storage_class[source]#
Retrieve the storage class for the bucket.
See: https://cloud.google.com/storage/docs/json_api/v1/buckets and https://cloud.google.com/storage/docs/durable-reduced-availability
Return type: string
Returns: Currently one of “STANDARD”, “DURABLE_REDUCED_AVAILABILITY”.
- time_created[source]#
Retrieve the timestamp at which the bucket was created.
See: https://cloud.google.com/storage/docs/json_api/v1/buckets
Return type: string
Returns: timestamp in RFC 3339 format.
- update_cors(entries)[source]#
Update CORS policies configured for this bucket.
Parameters: entries (list(dict)) – A sequence of mappings describing each CORS policy.
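For example, a sketch that allows cross-origin GET requests from any origin; the mapping keys follow the cors entries of the JSON API bucket resource, and the values shown are illustrative:
>>> from gcloud import storage
>>> connection = storage.get_connection(project, email, key_path)
>>> bucket = connection.get_bucket('my-bucket')
>>> bucket.update_cors([{
...     'origin': ['*'],
...     'method': ['GET'],
...     'responseHeader': ['Content-Type'],
...     'maxAgeSeconds': 3600,
... }])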
- update_lifecycle(rules)[source]#
Update lifecycle rules configured for this bucket.
See: https://cloud.google.com/storage/docs/lifecycle and https://cloud.google.com/storage/docs/json_api/v1/buckets
Parameters: rules (list(dict)) – A sequence of mappings describing each lifecycle rule.
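For example, a sketch that deletes objects once they are 365 days old; the rule format follows the lifecycle.rule entries of the JSON API bucket resource:
>>> from gcloud import storage
>>> connection = storage.get_connection(project, email, key_path)
>>> bucket = connection.get_bucket('my-bucket')
>>> bucket.update_lifecycle([{
...     'action': {'type': 'Delete'},
...     'condition': {'age': 365},
... }])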
- upload_file(filename, key=None)[source]#
Shortcut method to upload a file into this bucket.
Use this method to quickly put a local file in Cloud Storage.
For example:
>>> from gcloud import storage
>>> connection = storage.get_connection(project, email, key_path)
>>> bucket = connection.get_bucket('my-bucket')
>>> bucket.upload_file('~/my-file.txt', 'remote-text-file.txt')
>>> print bucket.get_all_keys()
[<Key: my-bucket, remote-text-file.txt>]
If you don’t provide a key value, we will try to upload the file using the local filename as the key (not the complete path):
>>> from gcloud import storage
>>> connection = storage.get_connection(project, email, key_path)
>>> bucket = connection.get_bucket('my-bucket')
>>> bucket.upload_file('~/my-file.txt')
>>> print bucket.get_all_keys()
[<Key: my-bucket, my-file.txt>]
Parameters: - filename (string) – Local path to the file you want to upload.
- key (string or gcloud.storage.key.Key) – The key (either an object or a remote path) of where to put the file. If this is blank, we will try to upload the file to the root of the bucket with the same name as on your local file system.
- upload_file_object(file_obj, key=None)[source]#
Shortcut method to upload a file object into this bucket.
Use this method to quickly put a local file in Cloud Storage.
For example:
>>> from gcloud import storage
>>> connection = storage.get_connection(project, email, key_path)
>>> bucket = connection.get_bucket('my-bucket')
>>> bucket.upload_file_object(open('~/my-file.txt'), 'remote-text-file.txt')
>>> print bucket.get_all_keys()
[<Key: my-bucket, remote-text-file.txt>]
If you don’t provide a key value, we will try to upload the file using the local filename as the key (not the complete path):
>>> from gcloud import storage
>>> connection = storage.get_connection(project, email, key_path)
>>> bucket = connection.get_bucket('my-bucket')
>>> bucket.upload_file_object(open('~/my-file.txt'))
>>> print bucket.get_all_keys()
[<Key: my-bucket, my-file.txt>]
Parameters: - file_obj (file) – A file handle open for reading.
- key (string or gcloud.storage.key.Key) – The key (either an object or a remote path) of where to put the file. If this is blank, we will try to upload the file to the root of the bucket with the same name as on your local file system.
- versioning_enabled[source]#
Is versioning enabled for this bucket?
See: https://cloud.google.com/storage/docs/object-versioning for details.
Return type: boolean
Returns: True if enabled, else False.