Once you've connected your input, you'll want to verify that it's running. There are a few ways to do this:

- Navigate to the Live tab to see events arriving in the system.
- Navigate to the Mapper to see the event type(s) created as a result of connecting the input.
- Navigate to the Dashboard to see the incoming rate, or notifications that may indicate a problem with the input.
- Click on the input node to see the incoming event rate.
Depending on the type of input, events may appear immediately (SaaS services, transactional databases, stores like S3 and Azure), or the input may be waiting for you to send events to the system or perform an external integration (any SDK, webhook). If the input is waiting on you, send some events and then verify that they arrive using the methods above.
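For webhook-style inputs, sending a test event is just an HTTP POST with a JSON body. Here's a minimal sketch using only the Python standard library; the URL, field names, and `send_event` helper are illustrative placeholders, not Alooma's actual endpoint or API. Use the webhook URL shown for your input in the Plumbing screen.

```python
import json
import urllib.request

def build_test_event(user_id, action):
    """Assemble a minimal JSON-serializable event payload for a webhook input."""
    return {"user_id": user_id, "action": action, "source": "smoke-test"}

def send_event(event, webhook_url):
    """POST the event as JSON to the webhook; returns the HTTP status code."""
    req = urllib.request.Request(
        webhook_url,
        data=json.dumps(event).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Example call (placeholder URL -- substitute your input's webhook endpoint):
# send_event(build_test_event("u-123", "signup"),
#            "https://inputs.example.com/webhook/<your-token>")
```

Once a few events like this have been posted, they should show up in the Live tab within moments.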
Yes! Most input types can be paused via the Pause Input button at the top of the Input node in the Plumbing screen. Click it again to resume.
When resuming a paused input, there can be a delay while the system "catches up" on data. For input types that replicate via log replication, such as MySQL, PostgreSQL, Oracle, or MongoDB, a long pause can result in errors if the log position is lost. You'll see a warning dialog if you try to pause an input where this could happen. Also note that pausing is not available for SDKs, Mixpanel, or Localytics.
If you're already familiar with columnar data warehouses such as Redshift, you know that they do not support single-row upserts or merge commands: they're oriented around appending data rather than updating existing data.
This means that if you are replicating a MySQL row or a Salesforce entry, for example, and a field's value changes, replication results in two values stored for the same field. The only way to distinguish them is the timestamp at which each value was stored. The same applies to deletions: the original value remains in the data warehouse alongside a new entry marking the value as deleted.
This can produce problematic query results and forces you to add complicated logic to ensure you only read the most recent value.
Many of our customers prefer an exact one-to-one replica of their data sources, without duplicate or conflicting values for their fields, over an append-only history. To provide this, Alooma performs an efficient consolidation process within the data warehouse that yields an exact one-to-one replica of the data sources without degrading query performance.
As a result, you can query your data in "human" real time (usually within a few minutes, not milliseconds).
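To illustrate what consolidation accomplishes (this is a conceptual sketch, not Alooma's actual implementation), the logic reduces an append-only change log to the latest state per row, dropping rows whose most recent entry is a deletion marker:

```python
def consolidate(change_log):
    """Collapse an append-only change log into current per-row state.

    change_log: list of dicts with 'id', 'value', 'ts' (timestamp), and
    'deleted' (True for deletion markers). Returns {id: value} holding
    only the most recent, non-deleted version of each row.
    """
    latest = {}  # row id -> most recent change record seen so far
    for change in change_log:
        current = latest.get(change["id"])
        if current is None or change["ts"] > current["ts"]:
            latest[change["id"]] = change
    # Rows whose newest entry is a deletion marker are dropped entirely.
    return {rid: c["value"] for rid, c in latest.items() if not c["deleted"]}

log = [
    {"id": 1, "value": "alice@old.com", "ts": 100, "deleted": False},
    {"id": 1, "value": "alice@new.com", "ts": 200, "deleted": False},  # update
    {"id": 2, "value": "bob@x.com", "ts": 150, "deleted": False},
    {"id": 2, "value": None, "ts": 250, "deleted": True},  # deletion marker
]
print(consolidate(log))  # {1: 'alice@new.com'}
```

Without consolidation, a naive query against the appended rows would see both of row 1's values and would still see row 2 after its deletion; with it, you query the warehouse as if it were the source table.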
If the integration you want isn't supported, check whether the data source supports pushing data via webhook. Many SaaS services do, which means you can add that data source via our custom webhook integration. If you need help determining whether webhook support is available, talk to us!
Alooma is constantly expanding its support for integrations and targets. If there's an integration you'd like that isn't currently supported, please contact us! It may already be in development, and there may be opportunities to join a beta group.
What do you mean, "if"? :) We know that data changes: schemas grow, shrink, and change. We've got your back! Configure your Mapper field settings to adapt to changes automatically, or adjust the mappings yourself and restream the affected events.
You've created your integration and now it needs to be edited — what to do? We're working on supporting editing, but in the meantime, reach out to our support team.