Data Engineering · 3/5/2024 · 5 min read

Building a Server-Side Event Decision Engine for GA4: Dynamic Transformations with GTM, Cloud Run & Firestore

You've built a robust server-side Google Analytics 4 (GA4) pipeline, leveraging Google Tag Manager (GTM) Server Container on Cloud Run for centralized data collection, transformations, enrichment, and granular consent management. This architecture provides unparalleled control and data quality, forming the backbone of your modern analytics strategy.

However, even with such a powerful data pipeline, a critical challenge often remains: how do you ensure the data sent to GA4 precisely aligns with your real-time business logic and strategic goals, without resorting to complex client-side code or static server-side configurations?

Your server-side GTM (GTM SC) now has access to a wealth of enriched context: user loyalty tiers (from BigQuery), A/B test variants (from Firestore), consent statuses, and more. The problem is that GA4's event model, while flexible, sometimes requires more dynamic, conditional adjustments to event payloads before they are dispatched.

Imagine scenarios like these:

  • Conditional Event Renaming: A "purchase" event with a value below a certain threshold ($10) should be logged as a "micro_purchase" in GA4 to avoid skewing high-value transaction reports.
  • Dynamic Parameter Adjustment: For users identified as "churn risk" by your real-time segmentation, you might want to send a custom risk_score parameter or even adjust the value of their events (e.g., for ad bidding optimization) to signal a lower quality conversion.
  • Event Filtering based on Complex Rules: Automatically drop "add_to_cart" events for specific item_ids that are permanently out of stock and can't be fulfilled, rather than sending misleading inventory signals.
  • Targeted Data Suppression: If a specific product category is undergoing maintenance, you might want to stop sending view_item events for those products to GA4 to avoid irrelevant data.

Relying on client-side JavaScript for these dynamic event modifications is brittle, easily bypassed, and adds unnecessary load. Static server-side GTM tags can handle some conditions, but for complex, evolving business rules, a dedicated "decision engine" approach is far more robust and agile.

The core problem is the need for a flexible, real-time server-side mechanism that can dynamically transform, filter, or modify GA4 event payloads based on sophisticated, configurable business rules and enriched contextual data, all within your GTM Server Container pipeline.

Why Server-Side for a Dynamic Event Decision Engine?

Implementing an event decision engine within your GTM Server Container on Cloud Run offers significant advantages:

  1. Unified Data Context: The engine has access to all the enriched event data already processed by your GTM SC (PII-scrubbed, consent-aware, user-stitched, enriched with CRM, product, or A/B test data).
  2. Centralized Control & Consistency: Define and manage all your event transformation rules in a single, controlled environment, ensuring consistent application across all events.
  3. Real-time Reactivity: Decisions are made and modifications applied with minimal latency, ensuring GA4 receives the most relevant and accurate data instantly.
  4. Agile Business Logic: Update business rules (e.g., change micro-purchase threshold, add new churn risk segments) by modifying a central configuration (e.g., in Firestore) without code deployments.
  5. Reduced Client-Side Complexity: Offload complex conditional logic from the browser to a scalable serverless environment, improving page load performance.
  6. Enhanced Data Quality & Actionability: Ensure GA4 receives data that is perfectly aligned with your business's strategic reporting and activation needs, leading to cleaner reports and more impactful decisions.

Our Solution Architecture: GTM SC → Event Decisioning Service → Modified Event → GA4

We'll extend your existing server-side GA4 architecture by introducing a dedicated Event Decisioning Service built on Cloud Run and leveraging Firestore for dynamic rules. This service will be called early in your GTM Server Container's processing flow, reacting to fully enriched events and returning precise instructions for event modification.

graph TD
    subgraph User Interaction
        A[User Browser/Client-Side] -->|1. Event (e.g., 'purchase')| B(GTM Web Container);
    end

    subgraph GTM Server Container Processing (on Cloud Run)
        B -->|2. HTTP Request to GTM SC Endpoint| C(GTM Server Container on Cloud Run);
        C --> D[3. Full GTM SC Processing: <br>Data Quality, PII Scrubbing, Consent, Enrichment, Identity Resolution, Schema Validation];
        D --> E[4. Fully Enriched Event Data (Internal)];
        E -->|5. Custom Tag: Call Event Decisioning Service| F(Event Decisioning Service on Cloud Run);
    end

    subgraph Event Decisioning Service
        F -->|6. Look up Dynamic Business Rules| G[Firestore: Event_Rules Collection];
        G -->|7. Return Matching Active Rules| F;
        F -->|8. Evaluate Rules & Build Suggested Modifications (new_event_name, parameter_changes, drop_event_flag)| F;
    end

    F -->|9. Apply Modifications to Event Data| E;
    E -->|10. Dispatch to GA4 Measurement Protocol| H[Google Analytics 4];
    E -->|11. (Parallel) Dispatch to Other Platforms| I[Google Ads, Facebook CAPI, etc.];
    E -->|12. (Parallel) Log to Raw Data Lake| J[BigQuery Raw Event Data Lake];

Key Flow:

  1. Client-Side Event: A user interaction (e.g., purchase) triggers an event, sent from the GTM Web Container to your GTM Server Container.
  2. GTM SC Processes & Enriches: The GTM SC receives the event and runs through all your pre-configured data quality, PII scrubbing, consent, enrichment (e.g., loyalty tier, customer segment), and identity resolution steps. This results in a fully enriched event data payload.
  3. Call Event Decisioning Service: A new, high-priority custom tag in GTM SC sends this fully enriched event data to your Event Decisioning Service (Cloud Run).
  4. Decisioning Logic (Cloud Run): This Python service receives the enriched event. It then:
    • Looks up dynamic Event Rules from a Firestore collection (e.g., "if event is 'purchase' AND value < 10, then suggest renaming event to 'micro_purchase'").
    • Applies these rules to the incoming event data.
    • Generates a set of suggested modifications (e.g., a new event name, a list of parameters to add/modify/delete, or a flag to drop the event entirely).
  5. Apply Modifications in GTM SC: The GTM SC custom tag receives these suggestions and uses setInEventData() or deleteFromEventData() to modify the event payload before any GA4 tags are triggered. If the drop_event_flag is set, the tag calls data.gtmOnFailure(); on its own this only marks the tag as failed, so pair it with tag sequencing (make this tag a setup tag for your GA4 tags with "Don't fire if setup tag fails") so downstream dispatch is actually blocked.
  6. Dispatch to GA4: The event, now dynamically transformed according to your business rules, proceeds to be dispatched to GA4 and other platforms for traditional analytics and reporting.
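To make the contract between the GTM SC tag and the decisioning service concrete, here is a minimal sketch of the request and response shapes. The field names mirror the service implementation later in this article; the sample values (the micro-purchase scenario) are illustrative assumptions:

```python
# Illustrative request/response contract for the decisioning service.
# The GTM SC tag POSTs the enriched event; the service replies with a
# resolved payload, an audit list of modifications, and a drop flag.

decision_request = {
    "event_name": "purchase",
    "value": 7.5,
    "currency": "USD",
    "_event_metadata": {"client_id": "1234567890.0987654321"},
}

# Expected response when the hypothetical micro-purchase rule fires:
decision_response = {
    "status": "success",
    "original_event_name": "purchase",
    "resolved_event_payload": {**decision_request, "event_name": "micro_purchase"},
    "applied_modifications": [
        {"type": "rename_event", "old_name": "purchase", "new_name": "micro_purchase"}
    ],
    "drop_event": False,
}

assert decision_response["resolved_event_payload"]["event_name"] == "micro_purchase"
```

The `drop_event` flag is deliberately separate from the payload so the GTM SC tag can short-circuit without inspecting the modifications list.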

Core Components Deep Dive & Implementation Steps

1. Firestore Setup: event_rules Collection

Firestore is ideal for storing dynamic business rules due to its low-latency reads and flexible document structure.

a. Create a Firestore Database:

  1. In the GCP Console, navigate to Firestore.
  2. Choose "Native mode" and select a region close to your Cloud Run services.

b. Structure Your event_rules Collection:

Each document ID is a rule_id; the fields hold the rule definition:

  • Document ID: rename_micro_purchase
    target_event_name: 'purchase'
    conditions: JSON ({'value': {'lessThan': 10}})
    action_type: 'rename_event'
    action_details: JSON ({'new_event_name': 'micro_purchase'})
    priority: 10
    is_active: true

  • Document ID: adjust_churn_risk_value
    target_event_name: 'purchase'
    conditions: JSON ({'user_loyalty_tier': 'Churn Risk'})
    action_type: 'modify_parameter'
    action_details: JSON ({'parameter': 'value', 'operator': 'multiply', 'operand': 0.5})
    priority: 20
    is_active: true

  • Document ID: remove_oos_items
    target_event_name: 'add_to_cart'
    conditions: JSON ({'items.$.stock_status': {'equals': 'out_of_stock'}})
    action_type: 'filter_items'
    action_details: JSON ({'item_property': 'stock_status', 'operator': 'not_equals', 'operand': 'out_of_stock'})
    priority: 30
    is_active: true

  • Document ID: drop_maintenance_product_views
    target_event_name: 'view_item'
    conditions: JSON ({'item_id': {'in': ['PROD_MAINT_001', 'PROD_MAINT_002']}})
    action_type: 'drop_event'
    action_details: JSON ({})
    priority: 5
    is_active: true
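These rules can also be seeded programmatically. A sketch using the google-cloud-firestore client (rule IDs and fields follow the structure above; two rules shown for brevity, and the write is left commented out since it requires credentials for your own project):

```python
# Rule documents for the event_rules collection. Conditions and action
# details are stored as Firestore map fields so the service can read
# them back as plain dicts.
RULES = {
    "rename_micro_purchase": {
        "target_event_name": "purchase",
        "conditions": {"value": {"lessThan": 10}},
        "action_type": "rename_event",
        "action_details": {"new_event_name": "micro_purchase"},
        "priority": 10,
        "is_active": True,
    },
    "drop_maintenance_product_views": {
        "target_event_name": "view_item",
        "conditions": {"item_id": {"in": ["PROD_MAINT_001", "PROD_MAINT_002"]}},
        "action_type": "drop_event",
        "action_details": {},
        "priority": 5,
        "is_active": True,
    },
}

def seed_rules():
    # Requires GCP credentials with Firestore write access.
    from google.cloud import firestore
    db = firestore.Client()
    for rule_id, fields in RULES.items():
        db.collection("event_rules").document(rule_id).set(fields)

# seed_rules()  # uncomment to write the rules to your project
```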

2. Python Event Decisioning Service (Cloud Run)

This Flask application receives the enriched event, queries Firestore for rules, evaluates them, and returns suggested modifications.

decision-engine-service/main.py example:

import os
import json
import datetime
import time
from flask import Flask, request, jsonify
from google.cloud import firestore
import logging

app = Flask(__name__)
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

# Initialize Firestore client
db = firestore.Client()
logger.info("Firestore client initialized for Event Decisioning Service.")

# Cache for event rules (assuming rules don't change by the millisecond)
_rules_cache = None
_last_rules_fetch_time = 0
RULE_CACHE_DURATION_SECONDS = 60 # Refresh rules every 60 seconds

def fetch_event_rules():
    global _rules_cache, _last_rules_fetch_time
    current_time = time.time()

    if _rules_cache is None or (current_time - _last_rules_fetch_time) > RULE_CACHE_DURATION_SECONDS:
        logger.info("Fetching event rules from Firestore...")
        rules = []
        rules_ref = db.collection('event_rules')
        # Order by priority to ensure higher priority rules are processed first
        # (Firestore doesn't support ordering on subfields like JSON path, so keep simple)
        for doc in rules_ref.where('is_active', '==', True).order_by('priority', direction=firestore.Query.ASCENDING).stream():
            rule = doc.to_dict()
            rule['id'] = doc.id
            rules.append(rule)
        _rules_cache = rules
        _last_rules_fetch_time = current_time
        logger.info(f"Fetched {len(rules)} active event rules.")
    return _rules_cache

def evaluate_condition(event_data, condition_key, operator, operand):
    """Evaluates a single condition against event_data."""
    # Handle nested keys like 'items.$.stock_status'
    parts = condition_key.split('.')
    current_value = event_data
    for i, part in enumerate(parts):
        if part == '$' and isinstance(current_value, list): # Special handling for array items
            # Recurse into each array item; the rule matches if ANY item matches
            return any(evaluate_condition(item, '.'.join(parts[i+1:]), operator, operand) for item in current_value)
        
        if isinstance(current_value, dict) and part in current_value:
            current_value = current_value[part]
        else:
            current_value = None # Key not found
            break

    if operator == 'equals':
        return current_value == operand
    elif operator == 'not_equals':
        return current_value != operand
    elif operator == 'greaterThan':
        return isinstance(current_value, (int, float)) and current_value > operand
    elif operator == 'lessThan':
        return isinstance(current_value, (int, float)) and current_value < operand
    elif operator == 'in':
        return current_value in operand # operand should be a list
    elif operator == 'exists':
        return current_value is not None
    elif operator == 'not_exists':
        return current_value is None
    # Add more operators as needed
    return False

def apply_rule_actions(event_payload, rule, modifications):
    """Applies the actions of a triggered rule to the event_payload."""
    action_type = rule.get('action_type')
    action_details = rule.get('action_details', {})
    
    if action_type == 'rename_event':
        new_event_name = action_details.get('new_event_name')
        if new_event_name:
            event_payload['event_name'] = new_event_name
            modifications.append({'type': 'rename_event', 'old_name': rule.get('target_event_name'), 'new_name': new_event_name})
    
    elif action_type == 'modify_parameter':
        param_key = action_details.get('parameter')
        operator = action_details.get('operator')
        operand = action_details.get('operand')
        
        if param_key and operator and operand is not None:  # operand may legitimately be 0
            # Handle nested keys for modification
            parts = param_key.split('.')
            target = event_payload
            for i, part in enumerate(parts):
                if i == len(parts) - 1: # Last part, this is the parameter to modify
                    if part in target and isinstance(target[part], (int, float)):
                        if operator == 'multiply':
                            target[part] *= operand
                            modifications.append({'type': 'modify_parameter', 'key': param_key, 'operator': operator, 'operand': operand, 'new_value': target[part]})
                        elif operator == 'add':
                            target[part] += operand
                            modifications.append({'type': 'modify_parameter', 'key': param_key, 'operator': operator, 'operand': operand, 'new_value': target[part]})
                    elif part not in target and operator == 'set_default': # Example: set if not present
                         target[part] = operand
                         modifications.append({'type': 'modify_parameter', 'key': param_key, 'operator': operator, 'operand': operand, 'new_value': target[part]})
                else:
                    if part not in target or not isinstance(target[part], dict):
                        target[part] = {} # Create path if it doesn't exist
                    target = target[part]

    elif action_type == 'filter_items':
        item_property = action_details.get('item_property')
        operator = action_details.get('operator')
        operand = action_details.get('operand')

        if 'items' in event_payload and isinstance(event_payload['items'], list) and item_property and operator:
            original_item_count = len(event_payload['items'])
            filtered_items = []
            for item in event_payload['items']:
                if evaluate_condition(item, item_property, operator, operand):
                    filtered_items.append(item)
            event_payload['items'] = filtered_items
            modifications.append({'type': 'filter_items', 'original_count': original_item_count, 'filtered_count': len(filtered_items)})
    
    elif action_type == 'drop_event':
        modifications.append({'type': 'drop_event'})
        # This will be handled by the GTM SC Custom Tag by returning a flag

    # If any rule renamed the event, store the original name for auditing
    if any(m['type'] == 'rename_event' for m in modifications):
        event_payload['_original_event_name'] = rule.get('target_event_name')  # Store original for audit
    
    return modifications # Return what was applied

@app.route('/make-event-decision', methods=['POST'])
def make_event_decision():
    """
    Receives fully enriched event data from GTM Server Container,
    applies dynamic business rules, and returns suggested modifications.
    """
    if not request.is_json:
        logger.warning(f"Request is not JSON. Content-Type: {request.headers.get('Content-Type')}")
        return jsonify({'error': 'Request must be JSON'}), 400

    # Defaults so the except block can safely reference these names
    enriched_event = {}
    event_name = None
    try:
        enriched_event = request.get_json()
        event_name = enriched_event.get('event_name')
        client_id = enriched_event.get('_event_metadata', {}).get('client_id') # For logging
        
        if not event_name or not client_id:
            logger.error("Missing event_name or client_id in request for event decisioning.")
            return jsonify({'error': 'Missing critical event identifiers'}), 400
        
        logger.info(f"Evaluating rules for event '{event_name}' (Client ID: {client_id[:10]}...).")

        # Create a deep copy of the event to apply modifications
        mutable_event_payload = json.loads(json.dumps(enriched_event))
        
        rules = fetch_event_rules()
        applied_modifications = []
        drop_event = False

        for rule in rules:
            if not rule.get('is_active'):
                continue

            target_event = rule.get('target_event_name')
            if target_event != event_name:
                continue

            conditions = rule.get('conditions', {})
            all_conditions_met = True
            for condition_key, condition_details in conditions.items():
                operator = next(iter(condition_details)) # e.g., 'lessThan'
                operand = condition_details[operator]
                if not evaluate_condition(mutable_event_payload, condition_key, operator, operand):
                    all_conditions_met = False
                    break
            
            if all_conditions_met:
                logger.info(f"Rule '{rule['id']}' triggered for event '{event_name}'. Applying action: {rule.get('action_type')}")
                mod_result = apply_rule_actions(mutable_event_payload, rule, applied_modifications)
                
                # Check if a 'drop_event' action was triggered
                if any(m['type'] == 'drop_event' for m in mod_result):
                    drop_event = True
                    break # Stop processing further rules if event is to be dropped

        decision_result = {
            'status': 'success',
            'original_event_name': event_name,
            'resolved_event_payload': mutable_event_payload,
            'applied_modifications': applied_modifications,
            'drop_event': drop_event
        }
        
        logger.info(f"Decision for event '{event_name}': Drop={drop_event}, Modifications={len(applied_modifications)}")
        return jsonify(decision_result), 200

    except Exception as e:
        logger.error(f"Error during event decisioning for event {event_name}: {e}", exc_info=True)
        # On error, default to not dropping event and return original payload to avoid data loss
        return jsonify({
            'error': str(e),
            'status': 'failed',
            'original_event_name': event_name,
            'resolved_event_payload': enriched_event, # Return original payload on error
            'applied_modifications': [],
            'drop_event': False
        }), 500

if __name__ == '__main__':
    app.run(debug=True, host='0.0.0.0', port=int(os.environ.get('PORT', 8080)))
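The trickiest part of the service is the nested-key lookup in evaluate_condition, including the items.$ array form. A stripped-down, dependency-free restatement of that logic, useful as a quick sanity check before deploying (a simplified sketch, not the full operator set):

```python
def check(event, key, operator, operand):
    """Simplified evaluate_condition: walks dotted keys; '$' fans out
    over list items with 'any item matches' semantics."""
    parts = key.split(".")
    value = event
    for i, part in enumerate(parts):
        if part == "$" and isinstance(value, list):
            rest = ".".join(parts[i + 1:])
            return any(check(item, rest, operator, operand) for item in value)
        if isinstance(value, dict) and part in value:
            value = value[part]
        else:
            value = None
            break
    if operator == "equals":
        return value == operand
    if operator == "lessThan":
        return isinstance(value, (int, float)) and value < operand
    if operator == "in":
        return value in operand
    return False

event = {"event_name": "add_to_cart",
         "items": [{"item_id": "A", "stock_status": "in_stock"},
                   {"item_id": "B", "stock_status": "out_of_stock"}]}

assert check(event, "items.$.stock_status", "equals", "out_of_stock")
assert not check(event, "items.$.stock_status", "equals", "backordered")
assert check({"value": 7.5}, "value", "lessThan", 10)
```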

decision-engine-service/requirements.txt:

Flask
google-cloud-firestore
jsonschema # While not directly used for validation here, could be for complex schema rules

Deploy the Python service to Cloud Run:

gcloud run deploy event-decision-engine-service \
    --source ./decision-engine-service \
    --platform managed \
    --region YOUR_GCP_REGION \
    --allow-unauthenticated \
    --set-env-vars GCP_PROJECT_ID="YOUR_GCP_PROJECT_ID" \
    --memory 512Mi \
    --cpu 1 \
    --timeout 15s # Allow enough time for Firestore queries and rule evaluation

Important:

  • Replace YOUR_GCP_PROJECT_ID and YOUR_GCP_REGION with your actual values.
  • The --allow-unauthenticated flag is for simplicity. In production, consider authenticated invocations.
  • Ensure the Cloud Run service identity has the roles/datastore.user role (Firestore read access) on your GCP project.
  • Note down the URL of this deployed Cloud Run service.
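Before wiring the service into GTM SC, it is worth a quick smoke test. A stdlib-only sketch (the URL is a placeholder for your deployed service, and the payload is chosen to trip the example micro-purchase rule):

```python
import json
from urllib import request as urlrequest

SERVICE_URL = "https://event-decision-engine-service-xxxx.a.run.app"  # placeholder

def smoke_test(url=SERVICE_URL):
    # POST a sample enriched event and return the decision.
    payload = {
        "event_name": "purchase",
        "value": 7.5,
        "_event_metadata": {"client_id": "1234567890.0987654321"},
    }
    req = urlrequest.Request(
        url + "/make-event-decision",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urlrequest.urlopen(req, timeout=10) as resp:
        decision = json.loads(resp.read())
    # With the micro-purchase rule active, expect a rename, not a drop.
    print(decision.get("resolved_event_payload", {}).get("event_name"),
          decision.get("drop_event"))
    return decision

# smoke_test()  # run against your deployed URL
```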

3. GTM Server Container Custom Tag: Event Decision Orchestrator

This custom tag will fire after all your event enrichment and identity resolution is complete. It sends the complete event data to the Event Decisioning Service, processes the returned modifications, and applies them to the event payload.

GTM SC Custom Tag Template: Event Decision Orchestrator

const sendHttpRequest = require('sendHttpRequest');
const JSON = require('JSON');
const log = require('log');
const getAllEventData = require('getAllEventData');
const setInEventData = require('setInEventData');
const deleteFromEventData = require('deleteFromEventData');

// Configuration fields for the template:
//   - decisionEngineServiceUrl: Text input for your Cloud Run Event Decisioning Service URL
//   - enableDecisionEngine: Boolean checkbox to enable/disable (for testing)

const decisionEngineServiceUrl = data.decisionEngineServiceUrl;
const enableDecisionEngine = data.enableDecisionEngine === true;

if (!enableDecisionEngine) {
    log('Event Decision Engine is disabled. Skipping decisioning.', 'DEBUG');
    data.gtmOnSuccess();
    return;
}

if (!decisionEngineServiceUrl) {
    log('Event Decision Engine Service URL is not configured. Skipping.', 'ERROR');
    data.gtmOnSuccess(); // Do not block other tags
    return;
}

// Get the fully enriched event payload from GTM SC's context
// (getAllEventData returns the whole event object; getEventData requires a key path)
const enrichedEventPayload = getAllEventData();

log('Sending enriched event to Event Decisioning Service...', 'INFO');

// Callback signature: sendHttpRequest(url, callback, options, body)
sendHttpRequest(
    decisionEngineServiceUrl + '/make-event-decision',
    (statusCode, headers, body) => {
        if (statusCode >= 200 && statusCode < 300) {
            try {
                const response = JSON.parse(body);
                const dropEvent = response.drop_event === true;
                const resolvedPayload = response.resolved_event_payload || enrichedEventPayload;
                const appliedModifications = response.applied_modifications || [];

                setInEventData('_decision_engine_modifications', appliedModifications, true); // Log modifications for audit

                if (dropEvent) {
                    log('Decision Engine instructed to DROP event. Preventing subsequent tags from firing.', 'WARNING');
                    setInEventData('_event_decision_status', 'dropped', true);
                    data.gtmOnFailure(); // Signal failure; use tag sequencing so downstream tags don't fire
                    return;
                }

                // Apply modifications from the resolvedPayload to the current eventData
                log('Decision Engine returned modifications. Applying to eventData.', 'INFO');
                setInEventData('_event_decision_status', 'modified_or_passed', true);

                for (const key in resolvedPayload) {
                    setInEventData(key, resolvedPayload[key], false); // Persist the change for subsequent tags
                }
                // If the event was renamed, keep original name for audit
                if (resolvedPayload._original_event_name) {
                    setInEventData('_original_event_name', resolvedPayload._original_event_name, true);
                }

                data.gtmOnSuccess();

            } catch (e) {
                log('Error parsing Event Decision Engine service response:', e, 'ERROR');
                setInEventData('_event_decision_status', 'error_parsing_response', true);
                data.gtmOnSuccess(); // Continue processing on parsing error, log failure
            }
        } else {
            log('Event Decision Engine service call failed:', statusCode, body, 'ERROR');
            setInEventData('_event_decision_status', 'error_http_call', true);
            data.gtmOnSuccess(); // Continue processing on HTTP error, log failure
        }
    },
    { method: 'POST', headers: { 'Content-Type': 'application/json' }, timeout: 5000 },
    JSON.stringify(enrichedEventPayload)
);

Implementation in GTM SC:

  1. Create a new Custom Tag Template named Event Decision Orchestrator (grant Access event data, Send HTTP requests).
  2. Create a Custom Tag (e.g., GA4 Event Modifier) using this template.
  3. Configure decisionEngineServiceUrl with the URL of your Cloud Run service.
  4. Set enableDecisionEngine to true (checkbox checked).
  5. Trigger: Set the trigger for this tag to All Events (or specific, relevant events like page_view, purchase, add_to_cart). Ensure it fires after PII scrubbing, enrichment, identity resolution, and schema validation, but before GA4 or other platform tags (e.g., via tag sequencing or a high firing priority such as 150). This allows it to act on the most complete data and influence the final payload dispatched.

After this tag fires, your GTM SC's eventData will reflect any modifications suggested by the decision engine, or the event might be dropped entirely.

4. Utilizing Modified Events in GA4 and Other Platforms

Once the Event Decision Orchestrator has run, your eventData in the GTM Server Container will be updated.

a. Google Analytics 4 (GA4) Tags:

  • Your existing GA4 Configuration and Event Tags will simply use the modified event_name and event parameters that are now available in eventData. No changes needed directly in the GA4 tags themselves.
  • For renamed events (e.g., micro_purchase), ensure you have custom definitions configured in GA4 if you want to report on those specifically.
  • For dropped events, they simply won't appear in GA4, achieving your filtering goal.

b. Raw Event Data Lake (for Audit):

  • If you're implementing a raw event data lake, ensure your ingestion service logs both the original event payload (captured before this decision engine runs) and the _event_decision_status and _decision_engine_modifications parameters set by this tag. This provides a crucial audit trail, showing exactly what was modified or dropped and why.
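For the audit trail, a hedged example of querying the raw data lake for decision outcomes. The project, dataset, table, and column names are assumptions that depend on how your ingestion service writes events to BigQuery:

```python
# Daily summary of decision-engine outcomes per event name.
# `your-project.raw_events.events` and the column names are placeholders.
AUDIT_QUERY = """
SELECT
  event_name,
  _original_event_name,
  _event_decision_status,
  COUNT(*) AS events
FROM `your-project.raw_events.events`
WHERE DATE(event_timestamp) = CURRENT_DATE()
  AND _event_decision_status IS NOT NULL
GROUP BY 1, 2, 3
ORDER BY events DESC
"""

def run_audit():
    # Requires GCP credentials and the google-cloud-bigquery package.
    from google.cloud import bigquery
    client = bigquery.Client()
    for row in client.query(AUDIT_QUERY).result():
        print(dict(row))

# run_audit()  # uncomment once the table above exists in your project
```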

Benefits of This Server-Side Event Decision Engine Approach

  • Precise Data Alignment: Guarantee that GA4 data precisely reflects your evolving business definitions and strategic priorities, leading to more accurate reporting and ROI calculations.
  • Ultimate Control: Achieve granular control over event names, parameters, and even event presence, directly from a centralized, server-side rule engine.
  • Agile Business Rules: Marketing and product teams can rapidly define, update, and deploy complex business rules (e.g., for segmentation, value adjustments) by modifying Firestore, without requiring code deployments or GTM Web Container updates.
  • Enhanced Data Quality: Proactively transform and filter events that don't meet specific business criteria, reducing noise and improving the actionability of your GA4 data.
  • Reduced Client-Side Complexity: Offload intricate conditional logic from the browser to a scalable serverless environment, improving website performance and stability.
  • Unified Strategy: Ensure consistent application of business logic across all analytics and advertising platforms, as decisions are made from a single source of truth.
  • Auditability & Transparency: Firestore logs provide a clear record of your business rules, and GTM SC logs (especially to a raw data lake) provide an audit trail of applied modifications.

Important Considerations

  • Latency: Calling an external Cloud Run service introduces a small amount of latency to your initial GTM SC processing. Monitor this closely using Cloud Monitoring. The benefits of precise business alignment usually outweigh this minor overhead.
  • Cost: Firestore reads and Cloud Run invocations incur costs. Optimize rule complexity and rule-fetching (e.g., caching rules within the Cloud Run service) to manage expenses for high-volume sites.
  • Rule Management Complexity: As your rule set grows, managing rules directly in Firestore might become complex. Consider a dedicated UI layer for business users or integrating with a more sophisticated rule management system if your needs are extensive.
  • Order of Operations: The placement of the Event Decision Orchestrator tag within your GTM Server Container is crucial. It must run after all relevant enrichment and identity resolution, but before any downstream tags (like GA4, Facebook CAPI) would typically fire.
  • Error Handling & Fallbacks: Implement robust error handling in both the Cloud Run service and the GTM SC custom tag to gracefully manage cases where the service is unavailable or returns errors. The example defaults to not dropping the event if the service fails, ensuring some data flow, but a stricter policy might be needed based on your compliance needs.
  • Impact on Existing Reports: Dynamically renaming or dropping events will affect your GA4 reports. Plan for these changes and update your reporting dashboards or custom dimensions accordingly.

Conclusion

Achieving granular control over your GA4 event data, allowing for dynamic transformations based on real-time business rules, is a game-changer for data-driven organizations. By building a server-side event decision engine with GTM Server Container, a dedicated Python Cloud Run service, and dynamic rules stored in Firestore, you empower your analytics pipeline to reflect the nuanced priorities of your business. This advanced server-side capability not only ensures cleaner, more actionable GA4 data but also drives greater agility, better alignment between analytics and operations, and ultimately, more confident, data-driven decisions. Embrace server-side event decisioning to unlock the full potential of your analytics investments.


Need Help With Server-Side GA4 Event Decision Engines?

If you're struggling to dynamically transform your GA4 event data based on complex business rules, our team can help. Book a free 15-minute audit to identify what's broken and how to fix it.