
Distributed Splunk Environment

UXM is set up to handle 10,000+ Desktop agents and millions of Web page requests per day.

The recommended architecture is to set up a Splunk Heavy Forwarder with UXM (containing the NGINX/RabbitMQ queue) and send data via the HTTP Event Collector (HEC) to the indexers.

Setup Splunk Indexers

Install Indexer App

Install the app "uxmapp_indexer_YYYY.MM.DD.tar.gz" on the Splunk Indexers.

Installing the indexer app creates the following UXM indexes:

Index name            | Type    | Description
uxmapp_response       | Events  |
uxmapp_sessiondata    | Events  |
uxmapp_metrics        | Metrics | Metric store with high-performance metrics for charts with limited dimensions.
uxmapp_metrics_rollup | Metrics | Hourly rollup of metrics for long-term reporting and fastest performance.
uxmapp_confidential   | Events  | Confidential data that only a limited number of people can access and view.
uxmapp_si_hourly      | Events  | For hourly summary index rollups.
uxmapp_si_quarterly   | Events  | For quarterly summary index rollups.
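
After installing the app, you can verify that the indexes exist. One way, sketched here under the assumption that you run it in Splunk Web on an indexer (or later from the Search Head once it is connected) and that field names match your Splunk version, is:

| rest /services/data/indexes | search title="uxmapp_*" | table title datatype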

Activate HTTP Event Collector

Activate the HTTP Event Collector (HEC) on the indexers that should receive the UXM data.

This is done under Settings -> Data Inputs -> HTTP Event Collector -> Global Settings

Write down the FQDN/IP of the indexer, whether SSL is enabled, and the port number (default 8088); these settings will be used later when setting up the Heavy Forwarder.
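
To confirm that HEC is enabled and listening on the expected port, you can query its health endpoint; the hostname below is a placeholder:

# Replace the host with your indexer's FQDN/IP; drop -k if a trusted certificate is used
curl -k https://indexer.example.com:8088/services/collector/health

A healthy collector answers with a small JSON status message.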

Create HTTP Event Collector for Receiving UXM Data

Create a new HTTP Event Collector token and call it "UXM - uxmapp"; indexer acknowledgement has to be disabled.

Select:

  • Source Type: Automatic.
  • App Context: UXM Indexers (uxmapp_indexer).
  • Indexes: Select the 4 indexes uxmapp_confidential, uxmapp_metrics, uxmapp_response, uxmapp_sessiondata.
  • Default Index: uxmapp_response.

Press Preview and Submit, then write down the token value; these settings will be used when configuring the Heavy Forwarder and Search Head.
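
With the token noted down, you can send a quick test event to verify end-to-end delivery into the default index; hostname, port, and token below are placeholders:

# Placeholders: indexer host/port and the HEC token created above
curl -k https://indexer.example.com:8088/services/collector/event -H "Authorization: Splunk <HEC-token>" -d '{"event": "uxm hec test", "index": "uxmapp_response"}'

A successful request returns a JSON body with "code": 0, and the event should then be searchable in uxmapp_response.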

Setup Splunk Search Head

The Splunk Search Head contains dashboards and data models and is where the user analyses the UXM data.

Please note that the UXM app creates multiple scheduled searches that build summary indexes; these require that you follow Splunk best practices and forward all data from the Search Heads to the Indexers.
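
If in doubt whether forwarding is already in place, a quick check from the Search Head's CLI is:

# Assumes $SPLUNK_HOME points at the Splunk installation directory
$SPLUNK_HOME/bin/splunk list forward-server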

Install App

Install the following apps on the Search Head. You can skip the restart until later.

  • Search Head app: uxmapp_searchhead_YYYY.MM.DD.tar.gz
  • Custom visualization: uxmapp_waterfall_YYYY.MM.DD.tar.gz
  • Custom visualization: uxmapp_worldmap_YYYY.MM.DD.tar.gz

Go to Settings -> Data Inputs -> Scripts and enable the script setup/distributed_searchhead_000_setup_app.py. (The script creates default KVStore entries, Splunk roles, and a Splunk user that allows Heavy Forwarders to access the KVStore on the Search Head.) It will auto-disable when done.

You can also follow this guide to "Setup Search Head Manually" if you prefer to configure Splunk manually.

You can view the output of the script by running the following Splunk search:

index="_internal" source="*_setup_distributed_searchhead_000_setup_app.log"

Verify Roles

After the script has executed, there will be two new roles: uxmapp_user and uxmapp_admin.

There will also be a user called uxmapp_wsgi. Reset the password for this user and disable the requirement to change the password at next login; store the password, as it will be used later when setting up the Heavy Forwarder.
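
If you prefer the CLI to the web UI for this step, a sketch of the equivalent command is shown below; the new password and the admin credentials are placeholders:

# Placeholders: the new password and the admin credentials
$SPLUNK_HOME/bin/splunk edit user uxmapp_wsgi -password '<NewPassword>' -force-change-pass false -auth admin:<AdminPassword>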

Setup/Verify Permissions for App

Go to Apps -> Manage Apps and click permissions on the uxmapp app.

Add read permissions for the newly created uxmapp_user role and read+write permissions for the uxmapp_admin role.

Setup/Verify UXM Configuration

Open the UXM app; it will ask you to configure it. Enter the HTTP Event Collector hostname and token.

Apply the license, leave the rest of the values at their defaults, and press Save.

Enable Splunk Batch Processing Scripts

Enable the following Data Input scripts under Settings -> Data Inputs -> Scripts:

  • check_license.py
  • daily_maintenance.py
  • task_generate_tags.py
  • update_alert_event_summaries.py
  • update_kvstores.py
  • update_applications.py
  • update_endpoint_groups.py

The Splunk Search Head needs to be restarted when all configuration is done.
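
After the restart, you can check that the scripted inputs actually launch by searching splunkd's ExecProcessor messages, for example:

index="_internal" sourcetype=splunkd component=ExecProcessor uxmapp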

Setup Heavy Forwarder

The Splunk Heavy Forwarder (HF) receives the data and processes it according to the configuration in the Splunk Search Head KVStores. It also responds with configuration to the UXM Desktop agents when they synchronize hourly.

NGINX (Linux) or IIS (Windows) and RabbitMQ are needed to control data retrieval and queuing to avoid overloading the HF or the Splunk environment, because receiving data from Desktop endpoints and public websites requires a high number of TCP connections.

Setup RabbitMQ

Log in to the Heavy Forwarder and install RabbitMQ following the official guides:

Install the newest version of RabbitMQ and Erlang following the guide at https://www.rabbitmq.com/install-debian.html

We recommend holding the package: when performing major upgrades (for example from 3.11.x to 3.12.x), all feature flags have to be enabled first, otherwise rabbitmq-server breaks during the upgrade.

sudo rabbitmqctl enable_feature_flag all
sudo apt-mark hold rabbitmq-server
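
Before a later major upgrade, you can confirm that every feature flag is enabled:

sudo rabbitmqctl list_feature_flags

All flags should show as enabled.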

After installation, enable the management web interface and add the UXM user.

Info: Remember to generate a password and replace the "{GeneratedPassword}" placeholder below with it.

sudo rabbitmq-plugins enable rabbitmq_management
sudo service rabbitmq-server start
sudo rabbitmqctl add_user uxmapp {GeneratedPassword}
sudo rabbitmqctl set_user_tags uxmapp monitoring
sudo rabbitmqctl add_vhost /uxmapp/
sudo rabbitmqctl set_permissions -p /uxmapp/ uxmapp ".*" ".*" ".*"
sudo rabbitmqctl delete_user guest
sudo apt-mark hold rabbitmq-server
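
To verify that the vhost and the uxmapp user's permissions were created as expected, you can run:

sudo rabbitmqctl list_vhosts
sudo rabbitmqctl list_permissions -p /uxmapp/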

Install App

Install the app "uxmapp_heavyforwarder_YYYY.MM.DD.tar.gz" on the Splunk Heavy Forwarder. You can skip the restart untill later.

Configure App

Open the UXM app; it will ask you to configure it. Use the same Agent Key as on the Search Head, enter the KVStore, HTTP Event Collector, and RabbitMQ settings, leave the rest of the values at their defaults, and press Save.

The storage path is for UXM Desktop agent log files and UXM Robot agent video, screenshot, and log results; it can be skipped.

Save the generated Agent Key for later when deploying the UXM Desktop agent to endpoints. See Deploying Desktop Agents

Save and restart Splunk.

Setup uWSGI/Python

Install Python 3 and set up the uWSGI environment by running the following commands:

sudo apt-get update
sudo apt-get -y install python3-pip python3-virtualenv

Set up the uWSGI environment as a non-root user:

cd $SPLUNK_HOME/etc/apps/uxmapp/bin/wsgi/
sudo mkdir -p /var/log/uwsgi/ && sudo chown -R splunk:splunk /var/log/uwsgi/
sudo virtualenv -p python3 ../wsgi/ && sudo chown -R splunk:splunk ../wsgi/
source ../wsgi/bin/activate
pip install uwsgi
deactivate
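
Optionally, confirm that the uWSGI binary landed in the virtual environment:

# Path is relative to the bin/wsgi/ directory used above
../wsgi/bin/uwsgi --version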

Add the WSGI data receiver as a service that starts with the server:

echo "Auto-start uWSGI on boot"
sudo cp ../wsgi/wsgi-uxm.template.service /etc/systemd/system/wsgi-uxm.service

# Check that uxmapp folder is correct in params: WorkingDirectory, Environment and ExecStart
# sudo vi /etc/systemd/system/wsgi-uxm.service

sudo systemctl enable wsgi-uxm && sudo systemctl restart wsgi-uxm
sudo systemctl status wsgi-uxm

After the commands are executed, systemctl status wsgi-uxm should show the service as active (running).

Setup NGINX (Linux) or IIS (Windows)

NGINX/IIS is used to create a web front end for the uWSGI data receiver.

For increased security, we recommend that you set up HTTPS certificates or use a reverse proxy if data has to be received from outside the company network.

The UXM Web agent and UXM Browser extensions require that a valid HTTPS certificate is configured, because data is sent directly from the user's browser using the same HTTP/HTTPS security as the monitored website. (UXM Desktop Agents after version 2025.01.30 can send browser data through the UXM Desktop agent.)

Used for                              | Reverse Proxy endpoint                                             | Splunk consumer script
Desktop/Endpoint agent data receiving | https://fqdn/data/pcagent (adds the request to the RabbitMQ queue) | bin\task_mq_consumer_pcagent.py reads from the RabbitMQ queue.
Web agent data receiving              | https://fqdn/data/browser (adds the request to the RabbitMQ queue) | bin\task_mq_consumer_web.py reads from the RabbitMQ queue.

The Web agent endpoint has to respond with the following headers:

Access-Control-Allow-Origin: *
Access-Control-Allow-Methods: GET, POST, OPTIONS
Access-Control-Allow-Headers: origin, content-type, accept, LoginRequestCorrelationId
Content-Type: text/plain

Install NGINX on the Heavy Forwarder:

sudo apt-get install nginx-light

See Configure NGINX for setting up NGINX
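
Once NGINX is configured, a quick way to check that the Web agent endpoint returns the required headers from the table above is a preflight-style request; fqdn is a placeholder and -k skips certificate validation for self-signed test setups:

# fqdn is a placeholder for your Heavy Forwarder / reverse proxy hostname
curl -ik -X OPTIONS https://fqdn/data/browser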

Enable Splunk Batch Processing Scripts

Enable the scripts for UXM Web and UXM Desktop agent data processing if those technologies are used.

Go to Settings -> Data Inputs -> Scripts and enable:

  • "task_mq_consumer_pcagent.py consumer1"
  • "task_mq_consumer_web.py consumer1"

Check for Errors

Open the UXM app on the Search Head and select Reports -> Maintenance Reports -> Maintenance - Self-monitoring for UXM. The dashboard shows the status of the installation and reports any errors detected.

The PCAgent and Web consumers will report a healthy status on this dashboard if everything works.