Data Modeling FAQs

What are transient dimensions?

A transient dimension is any field used as a dimension that will change during a user's lifetime. Examples include:

  • Level (Level 1, Level 2, Level 3 etc.)
  • Is Spender (non-spender to spender)

These fields may affect the Count of Users. For example, consider a Reporting Table with the following setup:

  • Dimensions
    • Activity Date
    • User Level
  • Measures
    • Count of Users

Consider then a user who on one day leveled up from User Level 2 to User Level 3. In the Reporting Table in question, that user will be counted in both User Level 2 and User Level 3 for that day. Suppose then that a Widget is created using that Reporting Table with the following setup:

  • Dimensions
    • Activity Date
  • Measures
    • Count of Users

In this case, because the User Level dimension is not used in the Widget, the Count of Users is summed over that dimension, and the example user is counted twice: once for User Level 2 and once for User Level 3. This results in an incorrect Count of Users. Adding User Level as a Dimension in the Widget gives the correct Count of Users per User Level per day.

It is important to pay attention to the Dimensions of the Reporting Table used for Count of Users in any Widget. To get a correct Count of Users, the Dimensions of the Reporting Table need to be appropriate. In particular, to get the correct total Count of Users for an Activity Date, no transient Dimensions can be used in the Reporting Table.
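The double-counting described above can be sketched in a few lines of Python (the rows and values here are hypothetical, purely for illustration):

```python
from collections import defaultdict

# Hypothetical reporting-table rows for one Activity Date: a user who
# leveled up that day appears under both User Level 2 and User Level 3.
# Each row: (activity_date, user_level, count_of_users)
rows = [
    ("2016-02-01", "Level 2", 1),  # the user before leveling up
    ("2016-02-01", "Level 3", 1),  # the same user after leveling up
]

# Widget WITHOUT the User Level dimension: counts are summed over it,
# so the single real user is counted twice.
by_date = defaultdict(int)
for activity_date, level, count in rows:
    by_date[activity_date] += count
print(by_date["2016-02-01"])  # 2 -- inflated

# Widget WITH the User Level dimension: each level shows the correct count.
by_date_level = defaultdict(int)
for activity_date, level, count in rows:
    by_date_level[(activity_date, level)] += count
print(by_date_level[("2016-02-01", "Level 2")])  # 1
print(by_date_level[("2016-02-01", "Level 3")])  # 1
```

The per-level counts are correct, but their sum over the transient dimension is not a valid total Count of Users.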

Why is my reporting table not returning the data I expect?

If your reporting table is not returning the data you expect, verify that the options below have been checked.

Run Daily

A reporting table must have the Run daily at 00:00 UTC option checked to process data every night.


Event Types

A reporting table needs to include the necessary event types for the data set to return the expected results.

  • If an Event Field or User Field is dependent on a specific Event Type, that Event Type must be included in the scope.
  • Similarly, the Project Environments need to be scoped to the Environment(s) each API Key is sending data to.
  • If choosing specific Event Types does not return data, try expanding the results to All Event Types and All Projects and Environments.

Note: When you add new Event Types to an existing Data Model, reporting tables may have to be updated to include the new Event Type.


Measures

Measures may not return data if they have conditions that are scoped to specific event types, or if the current data set does not meet those conditions.

Example: Count of Spenders with a condition of om_event_type = om_revenue. If no revenue events were sent, the Count of Spenders would be 0.
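The behavior of such a conditioned measure can be sketched as follows (the event data is hypothetical):

```python
# Sketch of how a conditioned measure like "Count of Spenders"
# (condition: om_event_type = om_revenue) evaluates to 0 when no
# matching events exist in the data set.
events = [
    {"uid": "user_1", "om_event_type": "om_load"},
    {"uid": "user_2", "om_event_type": "om_load"},
]

# Distinct users who sent a revenue event:
spenders = {e["uid"] for e in events if e["om_event_type"] == "om_revenue"}
print(len(spenders))  # 0 -- no revenue events were sent
```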


Additionally, some common causes of reporting tables not running correctly include:

  • Not having Activity Date, Project Name, and Project ID in the table
  • Not having any aggregate measures
  • Including an aggregate as a dimension
  • Having an Event Field or Custom User Field with incorrect Lua
  • Having too many fields (See Creating Reporting Tables for more information)

Should I have many simple reporting tables, or few complex reporting tables?

Omniata recommends creating many simple reporting tables over one or two large, complex reporting tables with all the dimensions.

It’s more efficient, in both computation and storage, to have a Reporting Table that corresponds to the dimensions and measures a Widget actually needs than to build one complex Reporting Table to power all Widgets. If you have very granular dimensions, multiple tables are even more important and efficient.

We refer to Reporting Tables that have many columns as wide tables and Reporting Tables that have fewer columns as narrow tables. One issue with wide tables is the size of the data, i.e. the number of rows in the reporting table. Having a lot of rows in a reporting table makes widgets load slower and eventually, with even more rows, causes widgets to fail to fetch data in a reasonable time. The exact number of rows that is supportable cannot be defined, but there are guidelines on dimension design.

You can estimate the scale of the number of rows in a table by thinking about the count of distinct values in each dimension: in a boolean dimension the count is two, and in a user ID column it equals the DAU of the application. The number of rows in a Reporting Table scales with the product of the distinct-value counts of its dimensions. For a Reporting Table with Activity Date and a boolean field, the count would be on the scale of 365 * 2 rows per year.

With a 1,000,000 DAU game, a reporting table with user ID and Activity Date would be on the scale of 365 * 1,000,000 rows per year. Adding another dimension on top of that (e.g. user experience, with values 0 - 100,000), there would be billions of rows annually.
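The estimates above are simple arithmetic; Python is used here purely for illustration:

```python
# Back-of-the-envelope row-count estimates: the number of rows in a
# Reporting Table scales with the product of the distinct-value counts
# of its dimensions.
DAYS_PER_YEAR = 365

# Activity Date x boolean dimension: ~365 * 2 rows per year
rows_boolean = DAYS_PER_YEAR * 2
print(rows_boolean)  # 730

# Activity Date x user ID, at 1,000,000 DAU: ~365 * 1,000,000 rows per year
rows_user_id = DAYS_PER_YEAR * 1_000_000
print(rows_user_id)  # 365000000
```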

As an example, imagine two widgets:

  1. Widget A requiring a granular dimension X

  2. Widget B requiring granular dimension Y

Now there are two options for the Reporting Tables: either create one table with X and Y as its dimensions, or create two tables, one with X and the other with Y. In the first case there are X * Y rows in the table; in the second case there are two tables with X + Y rows in total, which is much less. Omniata's nightly update process is highly optimized for scalability. One important feature is that having multiple tables doesn't really affect performance, since the tables are processed in parallel, i.e. events are scanned only once, independent of the number of tables. Thus the approach of having multiple, narrower tables is much more powerful.
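The difference between the two options can be made concrete with hypothetical cardinalities (the values below are illustrative, not from the product):

```python
# Suppose dimension X has 1,000 distinct values and Y has 2,000.
x, y = 1_000, 2_000

# Option 1: one wide table with both X and Y as dimensions.
one_wide_table_rows = x * y
# Option 2: two narrow tables, one per dimension.
two_narrow_tables_rows = x + y

print(one_wide_table_rows)     # 2000000
print(two_narrow_tables_rows)  # 3000
```

Even at these modest cardinalities, the wide table holds hundreds of times more rows than the two narrow tables combined.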

Additional benefits of the multiple-tables approach: it is much easier to understand the idea behind a narrow table just by looking at its definition than it is for a wide table, and it is less likely that you will make errors when creating narrow tables than when creating wide tables.

How do I pass a custom timestamp?

A custom timestamp can be passed as a parameter on an event_type to be used for creating custom session lengths or for getting more accurate times for reports when batching events or from players who were offline.

  1. The recommended parameter to include is { "ts" : epoch timestamp }, where the epoch timestamp represents the time of the event on the client
  2. Go to Detect Event Types, in Data > Event Types
    • After scanning for events, a field called 'ts' can be added to the Data App
  3. This 'ts' field can be referenced in other Event Fields and User Fields as an event formula
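The client-side part of this can be sketched as follows. This is an illustrative example, not SDK code; the parameter names other than "ts" are hypothetical:

```python
import time

# Attach a client-side "ts" epoch timestamp to an event's key-value
# pairs, e.g. when batching events or replaying events from an offline
# session, so reports reflect when the event actually happened.
def with_client_timestamp(params, event_time=None):
    """Return a copy of params with a 'ts' epoch timestamp added."""
    params = dict(params)
    params["ts"] = int(event_time if event_time is not None else time.time())
    return params

# Example: an event that occurred at epoch time 1392741804
event = with_client_timestamp({"om_event_type": "om_load"}, event_time=1392741804)
print(event["ts"])  # 1392741804
```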

How do I return a custom timestamp as a date / string?

  1. You could use the following formula to convert the "ts" parameter to a string:
    • os.date("%Y-%m-%d", tonumber(math.floor(event_kvp['ts'])) or 0)
  2. This will return "2014-02-18" for the epoch timestamp 1392741804
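For comparison, the same conversion in Python (illustration only; the Lua formula above is what runs in the Data App):

```python
from datetime import datetime, timezone

# Convert a 'ts' epoch timestamp (possibly a string) to a "%Y-%m-%d" date
# string, interpreting the timestamp as UTC.
def ts_to_date_string(ts):
    return datetime.fromtimestamp(int(float(ts)), tz=timezone.utc).strftime("%Y-%m-%d")

print(ts_to_date_string("1392741804"))  # 2014-02-18
```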

How do I get the time a user first triggers an event?

  1. Create an Event Field with the following:
    • Data Type: String
    • Event Formula:
      • (function() if tonumber(user_vars['name of your user field in step 2']) == nil then return user_vars['name of your user field in step 2'] else return os.date("%Y-%m-%d", tonumber(math.floor(event_kvp['ts'])) or 0) end end)()
    • For the following events: [the name of your event_type]
  2. Create a User Field with the following:
    • Data Type: String
    • Data Source:
    • Event Field: [name of the event field from step 1]
  3. Update the Event Formula:
    • Update the event formula in the event field from step 1 with the machine name of the user field from step 2 (the 'name of your user field in step 2').
  4. Create a reporting table with the following:
    • Dimensions
      • Activity Date
      • Project Name
      • Project ID
      • Acquisition Date
      • First Event User Field
    • Measures
      • Count of Users
      • Count of Events
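The sticky "first event date" pattern in the steps above can be emulated in Python. This is illustrative only; it mirrors the Lua tonumber check and assumes the user field starts out holding a numeric default (such as 0) before the first matching event:

```python
from datetime import datetime, timezone

def to_number(v):
    # Lua-style tonumber: returns None when v is not convertible.
    try:
        return float(v)
    except (TypeError, ValueError):
        return None

def first_event_date(user_field_value, ts):
    # If the field already holds a date string (not numeric), keep it;
    # otherwise derive the date from the event's 'ts' timestamp (UTC).
    if to_number(user_field_value) is None:
        return user_field_value
    return datetime.fromtimestamp(int(ts), tz=timezone.utc).strftime("%Y-%m-%d")

# First matching event while the field still holds a numeric default:
print(first_event_date("0", 1392741804))           # 2014-02-18
# Later events leave the stored date untouched:
print(first_event_date("2014-02-18", 1500000000))  # 2014-02-18
```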

Getting the First Date of a Revenue Event

  1. Create an Event Field with the following:
    • Name: First Revenue Event Date
    • Machine Name: om_first_revenue_event_date
    • Data Type: String
    • Event Formula:
      (function()
        if tonumber(user_vars['om_user_first_revenue_event_date']) == nil
        then return user_vars['om_user_first_revenue_event_date']
        else return os.date("%Y-%m-%d", tonumber(math.floor(tonumber(user_state['datestamp']))) or 0)
        end
      end)()
    • For the following events: [om_revenue]
  2. Create a User Field with the following:
    • Name: User First Revenue Event Date
    • Machine Name: om_user_first_revenue_event_date
    • Data Type: String
    • Data Source:
    • Event Field: [First Revenue Event Date]
  3. Create a Table Field with the following:
    • Name: Days to First Revenue Purchase
    • Machine Name: om_days_to_first_revenue_purchase
    • Data Type: Integer
    • SQL Formula: DATEDIFF(DATE(`om_user_first_revenue_event_date`), `om_acquisition_date`)
  4. Create a reporting table with the following:
    • Dimensions
      • Activity Date
      • Project Name
      • Project ID
      • Acquisition Date
      • First Revenue Event Date
      • User First Revenue Event Date
      • Days to First Revenue Purchase
    • Measures
      • Count of Users
      • Count of Events
  5. Run the table from the beginning of when you started sending data to Omniata until today
  6. Create a widget with the following:
    • Dimensions
      • Days to First Revenue Purchase
    • Measures
      • Count of Users
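What the DATEDIFF table field computes can be checked by hand (Python used for illustration; the dates are hypothetical):

```python
from datetime import date

# Whole days between the first revenue event date and the acquisition
# date, matching DATEDIFF(DATE(first_revenue_date), acquisition_date).
def days_to_first_purchase(first_revenue_date, acquisition_date):
    d1 = date.fromisoformat(first_revenue_date)
    d0 = date.fromisoformat(acquisition_date)
    return (d1 - d0).days

print(days_to_first_purchase("2014-02-18", "2014-02-10"))  # 8
```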

What is Lifetime Value (LTV) and why is it important?

Lifetime Value is a projected value for the total amount a user will spend between when they enter an ecosystem and when they cease to be active. Omniata’s LTV model allows the daily projection of user revenues up to 360 days. At the same time, it can be broken out by Country, Project, Acquisition Date and Publisher by default. LTV is an important metric because deriving a numeric value for future revenues should be the backbone of marketing strategy. Knowing how much a user will ultimately be worth helps determine if ad spend is ROI positive, what countries are profitable to acquire users in, and can aid in projecting revenue. It can also be used to evaluate whether changes to a product were beneficial from a revenue standpoint.

How do I update a lookup table using an API call?

  1. Go to the lookup table's upload CSV page, e.g. /lookup_tables/123-items/upload_csv
  2. Add api/v1/ before /lookup_tables (we are versioning it here so upgrades to our upload process will not break your upload). This will be the URL for your request.
  3. Prepare the CSV. You may want to check that the CSV uploads successfully manually before testing the API.
  4. Get your API Token, found on /users/edit
  5. In the same folder as the CSV, run the following command. Remember to update YOUR_API_KEY_IN_ACCOUNT_SETTINGS; test.csv is the name of the CSV file being uploaded here, so adjust as needed.

     curl -v -i -H 'Authorization: Token token="YOUR_API_KEY_IN_ACCOUNT_SETTINGS"' -X POST -F "lookup_table_csv_upload[delete_existing_records]=true" -F "lookup_table_csv_upload[csv_file]=@test.csv" "<the URL from step 2>"

If you only wish to upload new records, remove -F "lookup_table_csv_upload[delete_existing_records]=true".
If you get an HTTP status 200, the file was uploaded successfully. We will send a different status in case of errors.
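The same request can be built in Python. The sketch below only constructs the request (the host is hypothetical, and the multipart CSV body that curl's -F flags produce is omitted for brevity):

```python
import urllib.request

# Build the upload request against the versioned lookup-table endpoint.
# "example.omniata.com" is a placeholder for your panel's host; the path
# follows the /api/v1/lookup_tables/.../upload_csv pattern above.
url = "https://example.omniata.com/api/v1/lookup_tables/123-items/upload_csv"
req = urllib.request.Request(
    url,
    method="POST",
    headers={"Authorization": 'Token token="YOUR_API_KEY_IN_ACCOUNT_SETTINGS"'},
)

print(req.get_method())                # POST
print(req.get_header("Authorization")) # Token token="YOUR_API_KEY_IN_ACCOUNT_SETTINGS"
```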

This article was last updated on February 29, 2016 18:51. If you didn't find your answer here, search for another article or contact our support.