Alooma can read and replicate all of the data in files from a Google Cloud Storage bucket. This allows you to, for example, merge arbitrary data from Google Cloud Storage with client usage data in your data warehouse.
Files in your Google Cloud Storage bucket can be packaged and compressed using Tar or GZip, and can contain a variety of different file formats.
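To illustrate the kind of packaging the input accepts, here is a minimal sketch that builds a small Tar/GZip archive in memory and lists its contents. The file name and payload are hypothetical examples, not anything Alooma requires; Alooma unpacks such archives for you, so this is only to show what "packaged and compressed using Tar or GZip" means.

```python
import io
import tarfile

# Build a small .tar.gz in memory containing one hypothetical data file.
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w:gz") as tar:
    data = b'{"event": "signup", "user": 1}\n'
    info = tarfile.TarInfo(name="events.json")
    info.size = len(data)
    tar.addfile(info, io.BytesIO(data))

# Reading it back shows the packaged file inside the archive.
buf.seek(0)
with tarfile.open(fileobj=buf, mode="r:gz") as tar:
    names = tar.getnames()
print(names)
```

A plain GZip file (without Tar) works the same way conceptually: one compressed data file instead of an archive of several.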
Add your Google Cloud Storage input from Alooma's plumbing screen
Click the button to authenticate with your Google credentials
Give your input a name. The name will appear in the plumbing screen and will be used to name the events that the input emits.
Define your Google Cloud Storage input source:
Project Name: the name of the project the bucket is associated with. Learn how to find your project name.
Bucket Name: Bucket names must contain between 3 and 63 characters. Names containing dots can contain up to 222 characters, but each dot-separated component must be 63 characters or less. Learn how to manage a bucket.
File prefix (optional): the full path prefix of the files you'd like to import, for instance mydir/sys_log. If a prefix is entered, files that don't match it are ignored.
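The two rules above can be sketched in a few lines. This is an illustrative approximation, not Alooma's implementation: a length check following the bucket-name limits as stated, and a simple prefix match over hypothetical object names.

```python
def valid_bucket_name(name: str) -> bool:
    """Approximate the length rules stated above: 3-63 characters,
    or up to 222 for dotted names whose dot-separated components
    are each 63 characters or less."""
    if "." in name:
        return (len(name) <= 222
                and all(1 <= len(part) <= 63 for part in name.split(".")))
    return 3 <= len(name) <= 63

# A full-path prefix such as mydir/sys_log selects only objects whose
# names start with it; everything else in the bucket is ignored.
objects = [
    "mydir/sys_log.2019-01-01.gz",   # matches the prefix
    "mydir/app_log.2019-01-01.gz",   # same directory, different prefix
    "other/sys_log.gz",              # different directory
]
selected = [o for o in objects if o.startswith("mydir/sys_log")]
print(selected)
```

Note that the prefix matches against the full object path, so mydir/sys_log matches mydir/sys_log.2019-01-01.gz but not other/sys_log.gz.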
Define which files to import:
All files pulls all the data from your bucket (matching the file prefix, if one was entered above), and continues to do so indefinitely.
A date span pulls files written between the given dates, inclusive. If you select the date span option and don't specify dates, the input will only pull files written after the input was created.
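The date-span behavior amounts to an inclusive range check on each file's write date. A minimal sketch, with hypothetical file names and dates:

```python
from datetime import date

def in_span(written: date, start: date, end: date) -> bool:
    """Inclusive date-span check: both endpoint dates are included."""
    return start <= written <= end

# Hypothetical files and their write dates.
files = {
    "log.2019-03-01": date(2019, 3, 1),
    "log.2019-03-15": date(2019, 3, 15),
    "log.2019-04-02": date(2019, 4, 2),
}
span = (date(2019, 3, 1), date(2019, 3, 31))
pulled = [name for name, d in files.items() if in_span(d, *span)]
print(pulled)
```

Because the span is inclusive, a file written exactly on the start or end date is pulled.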
Define the format of the files in your Google Cloud Storage from the list of our available file formats.
Leave the mapping mode at its default, OneClick, if you'd like Alooma to automatically map the Google Cloud Storage file events directly to your target data warehouse. Otherwise, the events will have to be mapped manually from the Mapper screen.
If any rows in your data contain null bytes, those rows are discarded entirely and will not appear in the Restream Queue.
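A quick way to check your data ahead of time is to scan rows for null bytes before uploading. This is an illustrative sketch with made-up rows, not part of Alooma; it simply shows which rows would be silently dropped.

```python
# Hypothetical CSV rows as raw bytes; the third contains a null byte.
rows = [
    b"id,name\n",
    b"1,alice\n",
    b"2,bo\x00b\n",   # would be discarded, with no Restream copy
]

# Rows with null bytes are dropped outright, so filtering (or fixing)
# them before upload is the only way to keep that data.
kept = [r for r in rows if b"\x00" not in r]
print(kept)
```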
That's it! You're ready to import your data from Google Cloud Storage into Alooma!