Transactional Data Requirements
This page describes the requirements for sharing sample transactional data during the preparation phase of the integration.
After the outlet list has been shared and reviewed, Fyre requires approximately one month of transactional data for the same outlets. This step is mandatory and applies to all integration models.
The purpose of this step is validation and alignment, not production ingestion.
Initial transactional data submission
Before any production or historical data is enabled, a sample of transactional data covering approximately one month must be shared with Fyre.
This initial dataset serves as a baseline to:
Validate structure and identifier alignment
Assess data quality and consistency
Enrich and complete the outlet profiles previously shared
Transactional data should only reference outlets that are included in the approved outlet list.
Why this data is required
The outlet list provides static information about each location, such as name and address.
Transactional data is required to understand how each outlet operates in practice. By analyzing real sales data, Fyre can derive characteristics that cannot be reliably inferred from metadata alone.
This ensures that outlets are placed into the correct market segments and that downstream analytics are accurate and comparable.
What one month of data allows us to do
A one-month sample of transactional data allows Fyre to analyze consumption patterns and derive key outlet characteristics, including:
The type of venue, inferred from what is sold and when
The relative size of the outlet, based on observed sales volume
The balance between food and beverages, and between alcoholic and non-alcoholic drinks
The average check size (typical spend per order)
Indicators of how active and established an outlet is
These insights are attached back to the outlets previously shared in the outlet list, completing their standardized profiles.
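As an illustration, the sketch below derives two of these characteristics, the average check size and the food versus beverage revenue split, from a handful of line items. The field names (order_id, category, quantity, unit_price) are simplified assumptions for the sketch and not Fyre's actual schema.

```python
# Minimal sketch: deriving outlet characteristics from a one-month sample.
# Field names are illustrative assumptions, not Fyre's actual schema.
from collections import defaultdict

sample_lines = [
    {"order_id": "A-1001", "category": "food", "quantity": 2, "unit_price": 9.50},
    {"order_id": "A-1001", "category": "beverage", "quantity": 1, "unit_price": 4.00},
    {"order_id": "A-1002", "category": "beverage", "quantity": 3, "unit_price": 5.50},
]

# Average check size: total revenue divided by the number of distinct orders.
revenue_per_order = defaultdict(float)
revenue_by_category = defaultdict(float)
for line in sample_lines:
    amount = line["quantity"] * line["unit_price"]
    revenue_per_order[line["order_id"]] += amount
    revenue_by_category[line["category"]] += amount

average_check = sum(revenue_per_order.values()) / len(revenue_per_order)
food_share = revenue_by_category["food"] / sum(revenue_by_category.values())

print(f"Average check size: {average_check:.2f}")
print(f"Food share of revenue: {food_share:.0%}")
```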
Market segmentation and enrichment
Transactional data is also used to place outlets into Fyre’s market segmentation model.
Based on observed sales behavior, outlets are classified using a four-level market segment hierarchy, ensuring consistent segmentation across different POS systems, regions, and markets.
In addition, transactional line items are interpreted and normalized so that:
Products with different naming conventions can be compared
Categories are applied consistently
Brands and brand owners can be identified where possible
This normalization step is essential for producing reliable category, brand, and owner-level analytics.
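As a simplified illustration of this normalization step, the sketch below maps raw POS product names onto a canonical product, category, and brand via a lookup table. The table, helper names, and matching logic are assumptions made for the sketch; they do not describe Fyre's actual normalization pipeline.

```python
# Sketch of line-item normalization: raw POS names -> canonical product data.
# The lookup table and helpers are illustrative assumptions only.
import re

CANONICAL_PRODUCTS = {
    "coca cola 0 33l": {"product": "Coca-Cola 0.33l", "category": "Soft Drinks", "brand": "Coca-Cola"},
    "coke 33cl": {"product": "Coca-Cola 0.33l", "category": "Soft Drinks", "brand": "Coca-Cola"},
}

def normalize_name(raw_name: str) -> str:
    """Lowercase, replace punctuation with spaces, and collapse whitespace."""
    cleaned = re.sub(r"[^\w\s]", " ", raw_name.lower())
    return re.sub(r"\s+", " ", cleaned).strip()

def normalize_line_item(raw_name: str) -> dict:
    key = normalize_name(raw_name)
    return CANONICAL_PRODUCTS.get(key, {"product": raw_name, "category": None, "brand": None})

print(normalize_line_item("Coca-Cola 0,33L"))   # matched to the canonical product
print(normalize_line_item("House Lemonade"))    # unmatched: category and brand stay empty
```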
Scope of the sample data
The transactional data sample should:
Cover approximately one full calendar month
Include only outlets present in the previously shared outlet list
Represent real production data
Test data or partial datasets should be avoided.
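Before sharing the sample, a partner can run a quick pre-delivery check along these lines. The field names and record layout below are illustrative assumptions, not a prescribed format.

```python
# Sketch of a pre-delivery check: every transaction should reference an outlet
# from the approved outlet list, and the sample should span roughly one month.
# Field names (outlet_id, created_at) are illustrative assumptions.
from datetime import datetime

approved_outlets = {"OUTLET-001", "OUTLET-002"}

transactions = [
    {"outlet_id": "OUTLET-001", "created_at": "2024-03-01T09:15:00"},
    {"outlet_id": "OUTLET-003", "created_at": "2024-03-28T18:40:00"},  # not in the approved list
]

unknown = {t["outlet_id"] for t in transactions} - approved_outlets
if unknown:
    print(f"Outlets not in the approved list: {sorted(unknown)}")

timestamps = [datetime.fromisoformat(t["created_at"]) for t in transactions]
span_days = (max(timestamps) - min(timestamps)).days
print(f"Sample spans {span_days} days")  # expect roughly a full calendar month
```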
Required fields
Regardless of how transactional data is delivered, certain information must always be available.
Each transactional record must include:
A reference to the outlet, matching the outlet list identifier
A unique order identifier
A unique order-item identifier
Product identifier and product name
Quantity and price
At least one reliable timestamp
These elements are required to correctly link transactions to outlets, deduplicate records, and support enrichment and aggregation.
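For illustration only, the sketch below shows a single line item carrying each of the mandatory elements, together with a simple completeness check. The field names are assumptions; the actual names depend on the delivery format agreed during onboarding.

```python
# Illustrative line item with all mandatory elements. Field names are assumed.
line_item = {
    "outlet_id": "OUTLET-001",        # must match the outlet list identifier
    "order_id": "2024-03-05-000123",  # unique order identifier (per outlet)
    "order_item_id": "000123-01",     # unique order-item identifier (per outlet)
    "product_id": "SKU-4711",
    "product_name": "Espresso",
    "quantity": 2,
    "unit_price": 2.80,
    "created_at": "2024-03-05T10:42:00+01:00",  # at least one reliable timestamp
}

REQUIRED_FIELDS = {
    "outlet_id", "order_id", "order_item_id",
    "product_id", "product_name", "quantity", "unit_price", "created_at",
}
missing = sorted(REQUIRED_FIELDS - line_item.keys())
print("Missing required fields:", missing or "none")
```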
Recommended fields
Some additional fields are not mandatory but are strongly recommended, as they improve enrichment quality and analytical accuracy.
Where available, these include:
Category identifiers and category names
Tax amounts and tax rates
Currency
Number of guests
Cost or purchase price
Item relationships, such as combos, modifiers, or toppings
Additional timestamps
Providing these fields improves normalization, segmentation, and revenue estimation.
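Continuing the illustrative example above, the recommended fields could appear as follows. All names and values remain assumptions used only for the sketch.

```python
# The same illustrative line item, extended with recommended fields (assumed names).
enriched_line_item = {
    # ... mandatory fields as in the previous example ...
    "category_id": "CAT-COFFEE",
    "category_name": "Hot Drinks",
    "tax_amount": 0.45,
    "tax_rate": 0.19,
    "currency": "EUR",
    "guest_count": 2,
    "cost_price": 0.60,
    "parent_order_item_id": None,   # e.g. links a modifier or topping to its parent item
    "paid_at": "2024-03-05T10:58:00+01:00",  # additional lifecycle timestamp
}
```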
Identifier rules
Identifier consistency is critical for transactional data.
Outlet references must exactly match the identifiers used in the outlet list
Order identifiers must be unique per outlet, not globally
Order-item identifiers must be unique per outlet
Identifiers must remain stable over time
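The sketch below illustrates why these rules matter for deduplication: because order and order-item identifiers are only unique per outlet, records have to be keyed by the combination of outlet reference and identifiers. Field names are again illustrative assumptions.

```python
# Sketch: deduplication keyed by (outlet, order, order item), since identifiers
# are only unique per outlet. Field names are illustrative assumptions.
def dedup_key(record: dict) -> tuple:
    return (record["outlet_id"], record["order_id"], record["order_item_id"])

records = [
    {"outlet_id": "OUTLET-001", "order_id": "1001", "order_item_id": "1"},
    {"outlet_id": "OUTLET-002", "order_id": "1001", "order_item_id": "1"},  # same ids, different outlet
    {"outlet_id": "OUTLET-001", "order_id": "1001", "order_item_id": "1"},  # true duplicate
]

seen = set()
unique_records = []
for record in records:
    key = dedup_key(record)
    if key not in seen:
        seen.add(key)
        unique_records.append(record)

print(len(unique_records))  # 2: the repeated OUTLET-001 record is dropped
```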
Timestamp expectations
Each transactional record must include one mandatory timestamp and may include additional optional timestamps that provide more context about the order lifecycle.
Order creation timestamp (mandatory)
This timestamp represents when the order was created or opened in the POS system. It must be clearly identifiable, use a consistent format across all records, and be expressed in a single, consistent timezone. The timezone does not need to be UTC. If a local timezone is used, it must be applied consistently and communicated during onboarding.
Payment timestamp (optional, strongly recommended)
This timestamp represents when the order was paid or closed. It is often different from the creation time and is important for understanding service duration, assigning sales to the correct business day, and improving time-based analysis.
When multiple timestamps are provided, they must be clearly distinguishable by name, follow the same formatting and timezone conventions, and consistently represent the same lifecycle events across all records.
If only one timestamp is available, it must represent the order creation time.
When using Fyre’s standardized transactional format, timestamps are expected in ISO 8601 format and expressed in UTC. This requirement applies only when the standardized format is used.
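As a small example of these conventions, the sketch below attaches an agreed local timezone to a POS timestamp and renders it as ISO 8601 in UTC, which is only needed when the standardized format is used. The timezone, field layout, and helper name are assumptions for the sketch.

```python
# Sketch: converting a local POS timestamp to ISO 8601 in UTC for the
# standardized format. The timezone used here is an illustrative assumption.
from datetime import datetime
from zoneinfo import ZoneInfo

POS_TIMEZONE = ZoneInfo("Europe/Berlin")  # example; the real zone is agreed during onboarding

def to_standardized_utc(local_value: str) -> str:
    """Attach the agreed POS timezone and render as ISO 8601 in UTC."""
    local_dt = datetime.fromisoformat(local_value).replace(tzinfo=POS_TIMEZONE)
    return local_dt.astimezone(ZoneInfo("UTC")).isoformat()

print(to_standardized_utc("2024-03-05T10:42:00"))  # -> 2024-03-05T09:42:00+00:00
```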
Standardized transactional format
To simplify onboarding and long-term maintenance, Fyre provides a standardized transactional data format.
Using this format is optional but strongly recommended. It can be reused for the initial sample, historical backfills, and ongoing daily data delivery.
Field-level definitions and examples are available in the Standardized Transactional Format section.
How to deliver the sample data
This page defines what data is required, not how it is delivered.
Details about delivery mechanisms are covered in the delivery-specific sections that follow. Partners should review those sections to determine which delivery model applies to their setup.
What happens after submission
Once the sample transactional data has been submitted, Fyre carries out a structured review to confirm that the data is ready to move forward.
During this review, the data is checked for structural completeness, correct use of identifiers and timestamps, and overall consistency with the previously shared outlet list. Fyre also performs initial quality checks and validates that the data can be reliably enriched and segmented.
If any adjustments are required, Fyre will share clear and actionable feedback with the partner, allowing issues to be resolved before production data is enabled.
After the sample data has been reviewed and approved, the final integration model is confirmed. Where applicable, API access is enabled, historical data delivery is scheduled, and daily production data flows are activated.
Next step
If you want to align with Fyre’s preferred structure, see Standardized Transactional Format for field-level definitions and examples.