Greeting Message A/B Testing


Prerequisites: To run a greeting message A/B test, you need ‘Write’ access to the Conversations resource set.

The greeting message is your first touchpoint with prospects, and a good first impression greatly improves your odds of engaging and converting a visitor. Greeting message A/B testing lets you algorithmically test and prune your greeting messages, across your entire audience or for specific segments, over a test interval.

Benefits of greeting message A/B testing

  • Measure and improve your greeting messages across your audience.
  • Target and run tests to understand which messages best cater to a specific segment of your audience, e.g., returning visitors or visitors spending more than 'x' seconds on your website.

What is A/B Testing?

A/B testing is a method used to compare two or more options against each other to determine which one performs better. The different options are called ‘variants’; variants are shown to an audience at random, and statistical analysis determines which variant performs best.

A/B testing greeting messages in Insent


  • In Insent, A/B testing compares the performance of greeting message variants across a common audience segment, optionally filtered by specific conditions.
  • Greeting messages are shown to the audience in a round-robin fashion.
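Round-robin assignment simply cycles through the variants in order, so each successive visitor sees the next variant. The sketch below illustrates the idea; the variant texts are made up, and this is not Insent's actual implementation.

```python
from itertools import cycle

# Hypothetical variant texts, in greeting-message order.
variants = ["Hi there!", "Welcome back!", "Need help finding something?"]

assignment = cycle(variants)  # round-robin iterator over the variants

# Each successive visitor receives the next variant in order.
first_six = [next(assignment) for _ in range(6)]
print(first_six)
# Visitors 1-3 see variants 1-3, then the cycle repeats for visitors 4-6.
```

Because the rotation is deterministic rather than random, each variant receives a near-equal share of visitors over the test window.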

Test Parameters

  • A greeting message A/B test must run for at least one day and can run as long as you would like. The duration of an A/B test is measured in days.
  • You can add up to four variants to a greeting message step.


  • A greeting message A/B test has one winner; all other variants are considered eliminated.
  • When the test ends, the greeting message with the most engagement is declared the winner and becomes the greeting message displayed to all visitors.
  • A test report is generated at the end of the experiment describing the number of visitors each greeting was shown to and each variant's engagement rate.
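The test parameters above can be summarized as two simple constraints. The check below is only an illustration of those constraints, not the product's validation code, and it assumes a test needs at least two variants to compare.

```python
MIN_DURATION_DAYS = 1   # tests are measured in whole days, minimum one day
MAX_VARIANTS = 4        # up to four variants per greeting message step

def validate_test(duration_days, num_variants):
    """Check the basic A/B test parameters (illustrative sketch only)."""
    if duration_days < MIN_DURATION_DAYS:
        raise ValueError("An A/B test must run for at least one day.")
    # Assumption: a comparison needs at least two variants.
    if not 2 <= num_variants <= MAX_VARIANTS:
        raise ValueError("A test needs between 2 and 4 variants.")
    return True

print(validate_test(7, 3))  # True: a 7-day test with 3 variants is valid
```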

Configure a greeting message A/B test

Locate the conversation where you would like to set up a greeting message A/B test.

Add a greeting message step if it is not already present. The greeting message step should show an option to add an A/B testing variant.

Add A/B Test variant

Add an A/B testing variant option in the greeting message step.

Click Add A/B testing variant. You can add up to four variants to an A/B test experiment. Next, choose the duration of your A/B test; it must be at least one day long.

What an A/B testing step looks like

A sample greeting message step with three variants

You cannot save a conversation if two or more variants have the same content; checks ensure all variants are different. An A/B test cannot run with duplicate variants.
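The duplicate check can be pictured as a uniqueness test on the variant texts. This is a sketch of the rule described above, not Insent's actual validation code; whether whitespace is normalized before comparing is an assumption.

```python
def has_duplicate_variants(variants):
    """Return True if any two variants share the same content."""
    # Assumption: leading/trailing whitespace is ignored when comparing.
    normalized = [v.strip() for v in variants]
    return len(set(normalized)) != len(normalized)

print(has_duplicate_variants(["Hello!", "Hi!", "Hello!"]))   # True  -> cannot save
print(has_duplicate_variants(["Hello!", "Hi!", "Welcome!"]))  # False -> OK to save
```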

You can also set up conditions that control when a conversation is shown. Set up conditions from Options (ellipsis icon) → Add condition.

Set up trigger conditions

Set up conditions that dictate when the greeting message is shown

Once you are ready to start an A/B test, save and start the conversation. The A/B test will show the date range and the number of days elapsed in the test duration.

Save and start an experiment.

An active A/B test is shown below, detailing information such as the total number of visits, the number of views, and the engagement rate.

Note: Engagement rate is the number of clicks divided by the number of views, expressed as a percentage.

Running A/B test

An active A/B test with 3 variants

When an A/B test is in progress

You can turn off variants during the execution of an A/B test and continue the test without restarting it. But if you edit a variant or attempt to enable a disabled variant, a warning message is shown.

If any variant is updated, or the greeting message step is edited and saved, the A/B test resets. You can extend or shorten an A/B test, but doing so also resets it.

Viewing an A/B test report

At the end of the A/B test, a report is generated: the best-performing greeting message is declared the winner, the other greeting messages are considered eliminated, and the winning message becomes the default greeting message.

Coloring and displaying the variant performance - Active experiment

  • The top-performing variant is shown in green.
  • The lowest-performing variant is shown in red.
  • Any variants in between are shown in yellow.

How is a successful variant identified at the end of an A/B Test?

The winner of an A/B test is identified from the 'Engagement rate' of the greeting message variants at the end of the test. The greeting message with the highest engagement rate is declared the winner. When variants have equal engagement rates, a priority rule order dictates which variant wins the A/B test.

Rules are evaluated in priority order to identify the winner of an A/B test and to resolve ties between variants:

  • Priority 1: The variant with the highest engagement rate is declared the winner.
  • Priority 2: The variant with the highest number of chat widget clicks is declared the winner.
  • Priority 3: If variants also have the same number of clicks, the test is considered inconclusive, and all variants are assigned the ‘inconclusive’ flag.

Note: Priority rule execution 

If the variants in the A/B test have an equal engagement rate at the end of the test, the second-priority rule is applied; if this also results in a tie, the third-priority rule is applied.

If a greeting message A/B test is inconclusive, the first variant in greeting message order is selected as the default.

At the end of the experiment, if the engagement rate is zero, the test is declared inconclusive, and the first variant in greeting message order becomes the default.
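The priority rules above can be sketched as a small selection function. This is an illustration of the documented rules, not Insent's implementation; the variant records and field names are made up for the example.

```python
def pick_winner(variants):
    """Apply the priority rules: engagement rate, then clicks, else inconclusive.

    `variants` is a list of dicts in greeting-message order, each with
    'name', 'rate' (engagement rate, %) and 'clicks' (chat widget clicks).
    Returns (variant name, outcome).
    """
    best_rate = max(v["rate"] for v in variants)
    if best_rate == 0:
        # Zero engagement: inconclusive; first variant in order is the default.
        return variants[0]["name"], "inconclusive"

    # Priority 1: highest engagement rate wins outright.
    tied = [v for v in variants if v["rate"] == best_rate]
    if len(tied) == 1:
        return tied[0]["name"], "winner"

    # Priority 2: among engagement-rate ties, most chat widget clicks wins.
    best_clicks = max(v["clicks"] for v in tied)
    tied = [v for v in tied if v["clicks"] == best_clicks]
    if len(tied) == 1:
        return tied[0]["name"], "winner"

    # Priority 3: still tied -> inconclusive; first variant in order is default.
    return variants[0]["name"], "inconclusive"

variants = [
    {"name": "A", "rate": 12.5, "clicks": 25},
    {"name": "B", "rate": 12.5, "clicks": 30},
    {"name": "C", "rate": 8.0, "clicks": 16},
]
print(pick_winner(variants))  # ('B', 'winner'): A and B tie on rate, B has more clicks
```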

Quick Recap

  • A greeting message A/B test allows you to statistically test and improve the effectiveness of your greeting messages.
  • Insent allows you to test greeting messages over a minimum duration of one day.
  • The outcome of a successful A/B test will be a greeting message that caters effectively to an audience.
  • You can set up a greeting message A/B test for any conversation. We recommend you run A/B tests for pages with significant traffic. 
