A transient dimension is any field used as a dimension whose value changes during a user's lifetime. Examples include fields such as User Level.
These fields may affect the Count of Users. For example, consider a Reporting Table with the following setup:
Consider then a user who on one day leveled up from User Level 2 to User Level 3. In the Reporting Table in question, that user will be counted under both User Level 2 and User Level 3 for that day. Now suppose a Widget is created using that Reporting Table with the following setup:
In this case, because the User Level dimension is not used in the Widget, the Count of Users is summed over that dimension, and the example user is counted twice: once for User Level 2 and once for User Level 3. This results in an incorrect Count of Users. Adding User Level as a Dimension in the Widget gives the correct Count of Users per User Level per day.

It is important to pay attention to the Dimensions of the Reporting Table used for Count of Users in any Widget. To get a correct Count of Users, the Dimensions of the Reporting Table need to be appropriate. In particular, to get the correct total Count of Users for an Activity Date, no transient Dimensions can be used in the Reporting Table.
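The double counting can be sketched with a toy example. The rows below are hypothetical table contents, not actual Omniata output:

```python
# Illustrative rows from a hypothetical Reporting Table with dimensions
# Activity Date and User Level. The example user leveled up from 2 to 3
# during the day, so they appear on two rows for the same date.
rows = [
    {"activity_date": "2024-06-01", "user_level": 2, "count_of_users": 1},
    {"activity_date": "2024-06-01", "user_level": 3, "count_of_users": 1},
]

# Summing Count of Users over the transient dimension double-counts:
naive_total = sum(r["count_of_users"] for r in rows)
print(naive_total)  # 2, although only one distinct user was active

# Keeping User Level as a Widget dimension gives a correct count per
# User Level per day (1 user at level 2, 1 user at level 3).
per_level = {r["user_level"]: r["count_of_users"] for r in rows}
print(per_level)  # {2: 1, 3: 1}
```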
If your reporting table is not returning the data you expect, verify that the options below have been checked.
A reporting table must have the Run daily at 00:00 UTC option checked in order to process data every night.
A reporting table needs to include the necessary event types for the data set to return the expected results.
Note: When you add new Event Types to an existing Data Model, reporting tables may have to be updated to include the new Event Type.
Measures may not return data if they have conditions scoped to specific event types and the current data set does not meet those conditions.
Example: a Count of Spenders measure with the condition om_event_type = om_revenue. If no revenue events were sent, the Count of Spenders will be 0.
Additionally, some common causes of reporting tables not running correctly include:
Omniata recommends creating many simple reporting tables over one or two large, complex reporting tables with all the dimensions.
It is more compute- and storage-efficient to have a Reporting Table that matches the dimensions and measures a Widget actually needs than to build one complex Reporting Table to power all Widgets. If you have very granular dimensions, it is even more important and efficient to use multiple tables.
We refer to Reporting Tables that have a lot of columns as wide tables and Reporting Tables that have fewer columns as narrow tables. One issue with wide tables is the size of the data, i.e. the number of rows in the reporting table. A large number of rows makes widgets load slower and, with even more rows, eventually causes widgets to fail to fetch the data in a reasonable time. The exact number of rows that is supportable cannot be defined, but there are guidelines on dimension design.
You can estimate the scale of the number of rows in a table from the count of distinct values in each dimension: for a boolean dimension the count is two, and for a user ID column it equals the DAU of the application. The number of rows in a Reporting Table scales with the product of the counts of distinct values of its dimensions. For a Reporting Table with an Activity Date and a boolean field, the count is on the scale of 365 * 2 rows per year.
With a 1,000,000 DAU game, a reporting table with user ID and Activity Date is on the scale of 365 * 1,000,000 rows per year. Adding another dimension on top of that (e.g. a user experience dimension with values 0 - 100,000) would mean billions of rows annually.
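The arithmetic above can be written out explicitly. The DAU figure comes from the example; the worst-case assumption is noted in the comments:

```python
# Back-of-the-envelope row-count estimates for one year of data.
DAYS_PER_YEAR = 365

# Activity Date x boolean dimension: two distinct values per day.
boolean_table_rows = DAYS_PER_YEAR * 2
print(boolean_table_rows)  # 730 rows per year

# Activity Date x user ID for a 1,000,000 DAU application, assuming
# in the worst case that every daily active user appears every day.
dau = 1_000_000
user_table_rows = DAYS_PER_YEAR * dau
print(user_table_rows)  # 365,000,000 rows per year

# Adding a user-experience dimension (values 0 - 100,000) multiplies
# the possible combinations further; even though only observed
# combinations produce rows, the table easily grows into the billions.
```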
As an example, imagine two widgets:
Widget A requiring a granular dimension X
Widget B requiring a granular dimension Y
Now there are two options for the Reporting Tables: either create one table having X and Y as dimensions, or create two tables, one with X and the other with Y. In the first case there are X * Y rows in the table; in the second case there are two tables with X + Y rows in total, which is much less. Omniata's nightly update process is highly optimized for scalability. One important feature is that having multiple tables does not really affect performance, since the tables are processed in parallel, i.e. events are scanned only once, independent of the number of tables. Thus the approach of having multiple, narrower tables is much more powerful.
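To make the X * Y versus X + Y comparison concrete, here is a sketch with hypothetical cardinalities of 10,000 for each dimension:

```python
# Hypothetical distinct-value counts for the two granular dimensions.
x = 10_000  # distinct values of dimension X
y = 10_000  # distinct values of dimension Y

# One wide table with both dimensions: rows scale with the product.
wide_table_rows = x * y

# Two narrow tables, one dimension each: rows scale with the sum.
narrow_tables_rows = x + y

print(wide_table_rows)     # 100000000
print(narrow_tables_rows)  # 20000
```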
Additional benefits of the multiple-tables approach are that it is much easier to understand the idea behind a narrow table just by looking at its definition than it is for a wide table, and errors are less likely when creating narrow tables than when creating wide tables.
A custom timestamp can be passed as a parameter on an event type to create custom session lengths, or to get more accurate times in reports when batching events or receiving events from players who were offline.
Lifetime Value (LTV) is a projected value for the total amount a user will spend between when they enter an ecosystem and when they cease to be active. Omniata’s LTV model allows the daily projection of user revenues up to 360 days. At the same time, it can be broken out by Country, Project, Acquisition Date and Publisher by default. LTV is an important metric because deriving a numeric value for future revenues should be the backbone of marketing strategy. Knowing how much a user will ultimately be worth helps determine if ad spend is ROI positive, what countries are profitable to acquire users in, and can aid in projecting revenue. It can also be used to evaluate whether changes to a product were beneficial from a revenue standpoint.
Go to the lookup table CSV upload page.
Add api/v1/ before /lookup_tables (we are versioning it here so upgrades to our upload process will not break your upload)
This will be the URL for your request.
Prepare the CSV. You may want to check manually that the CSV uploads successfully before testing the API.
Get your API token, found on /users/edit.
In the same folder as the CSV, run the following command; remember to replace YOUR_API_KEY_IN_ACCOUNT_SETTINGS with your own key.
Here test.csv is the name of the CSV file being uploaded; adjust as needed.
curl -v -i -H 'Authorization: Token token="YOUR_API_KEY_IN_ACCOUNT_SETTINGS"' -X POST -F "lookup_table_csv_upload[delete_existing_records]=true" -F "lookup_table_csv_upload[csv_file]=@test.csv" https://example.panel.omniata.com/api/v1/lookup_tables/123-items/upload_csv
If you only wish to upload new records without deleting existing ones, omit -F "lookup_table_csv_upload[delete_existing_records]=true"
An HTTP status 200 means the file was uploaded successfully; a different status is returned in case of errors.
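For programmatic uploads, the curl command can be mirrored in Python using only the standard library. This is an illustrative sketch, not an official client: the multipart encoding is written by hand, and the panel URL, table identifier, and token are placeholders taken from the example above, to be replaced with your own values.

```python
import urllib.request
import uuid

def build_upload_request(panel_url, table_id, api_token, csv_bytes,
                         filename="test.csv", delete_existing=True):
    """Build the multipart/form-data POST used by the upload_csv endpoint."""
    url = f"{panel_url}/api/v1/lookup_tables/{table_id}/upload_csv"
    boundary = uuid.uuid4().hex
    body = b""
    if delete_existing:  # mirrors -F "...[delete_existing_records]=true"
        body += (
            f"--{boundary}\r\n"
            'Content-Disposition: form-data; '
            'name="lookup_table_csv_upload[delete_existing_records]"\r\n'
            "\r\ntrue\r\n"
        ).encode()
    body += (
        f"--{boundary}\r\n"
        f'Content-Disposition: form-data; '
        f'name="lookup_table_csv_upload[csv_file]"; filename="{filename}"\r\n'
        "Content-Type: text/csv\r\n\r\n"
    ).encode() + csv_bytes + b"\r\n"
    body += f"--{boundary}--\r\n".encode()
    return urllib.request.Request(
        url,
        data=body,
        method="POST",
        headers={
            "Authorization": f'Token token="{api_token}"',
            "Content-Type": f"multipart/form-data; boundary={boundary}",
        },
    )

# Sending the request (requires network access to your panel):
# with open("test.csv", "rb") as f:
#     req = build_upload_request("https://example.panel.omniata.com",
#                                "123-items", "YOUR_API_KEY", f.read())
#     response = urllib.request.urlopen(req)
# response.status == 200 indicates success, as with the curl command.
```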