[Botframework]: How to Capture/Extract the Values Submitted Through Adaptive Card Rendered in C# Web Chat Bot in a Waterfall Dialog

Using Adaptive Cards with Waterfall Dialogs

Natively, Adaptive Cards don't work like prompts. A prompt displays its message and then waits for user input before continuing. An Adaptive Card, on the other hand (even one containing an input box and a submit button), contains nothing that will cause a Waterfall Dialog to wait for user input before continuing the dialog.

So, if you're using an Adaptive Card that takes user input, you generally want to handle whatever the user submits outside of the context of a Waterfall Dialog.

That being said, if you want to use an Adaptive Card as part of a Waterfall Dialog, there is a workaround. Basically, you:

  1. Display the Adaptive Card
  2. Display a Text Prompt
  3. Convert the user's Adaptive Card input into the input of a Text Prompt

In your Waterfall Dialog class (steps 1 and 2):

    private async Task<DialogTurnResult> DisplayCardAsync(WaterfallStepContext stepContext, CancellationToken cancellationToken)
    {
        // Display the Adaptive Card
        var cardPath = Path.Combine(".", "AdaptiveCard.json");
        var cardJson = File.ReadAllText(cardPath);
        var cardAttachment = new Attachment()
        {
            ContentType = "application/vnd.microsoft.card.adaptive",
            Content = JsonConvert.DeserializeObject(cardJson),
        };
        var message = MessageFactory.Text("");
        message.Attachments = new List<Attachment>() { cardAttachment };
        await stepContext.Context.SendActivityAsync(message, cancellationToken);

        // Create the text prompt
        var opts = new PromptOptions
        {
            Prompt = new Activity
            {
                Type = ActivityTypes.Message,
                Text = "waiting for user input...", // You can comment this out if you don't want to display any text. Still works.
            }
        };

        // Display a Text Prompt and wait for input
        return await stepContext.PromptAsync(nameof(TextPrompt), opts, cancellationToken);
    }

    private async Task<DialogTurnResult> HandleResponseAsync(WaterfallStepContext stepContext, CancellationToken cancellationToken)
    {
        // Do something with stepContext.Result
        // Adaptive Card submissions are objects, so you likely need to JObject.Parse(stepContext.Result)
        await stepContext.Context.SendActivityAsync($"INPUT: {stepContext.Result}");
        return await stepContext.NextAsync(cancellationToken: cancellationToken);
    }

In your main bot class (<your-bot>.cs), under OnTurnAsync(), near the beginning of the method, somewhere before await dialogContext.ContinueDialogAsync(cancellationToken) is called (step 3):

    var activity = turnContext.Activity;

    if (string.IsNullOrWhiteSpace(activity.Text) && activity.Value != null)
    {
        activity.Text = JsonConvert.SerializeObject(activity.Value);
    }
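
For instance, with a card whose only input is an Input.Text with id userText (like the sample card in the next section), the string copied into activity.Text by this snippet would look something like:

    {"userText":"Testing Testing 123"}

The TextPrompt then receives that JSON string as if the user had typed it.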

Additional Context

Adaptive Cards send their Submit results a little differently than regular user text. When a user types in the chat and sends a normal message, it ends up in Context.Activity.Text. When a user fills out an input on an Adaptive Card, it ends up in Context.Activity.Value, which is an object whose key names are the ids in your card and whose values are the field values in the Adaptive Card.

For example, the json:

    {
        "type": "AdaptiveCard",
        "body": [
            {
                "type": "TextBlock",
                "text": "Test Adaptive Card"
            },
            {
                "type": "ColumnSet",
                "columns": [
                    {
                        "type": "Column",
                        "items": [
                            {
                                "type": "TextBlock",
                                "text": "Text:"
                            }
                        ],
                        "width": 20
                    },
                    {
                        "type": "Column",
                        "items": [
                            {
                                "type": "Input.Text",
                                "id": "userText",
                                "placeholder": "Enter Some Text"
                            }
                        ],
                        "width": 80
                    }
                ]
            }
        ],
        "actions": [
            {
                "type": "Action.Submit",
                "title": "Submit"
            }
        ],
        "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
        "version": "1.0"
    }

.. creates a card with a "Test Adaptive Card" heading, a labeled text input, and a Submit button.

If a user enters "Testing Testing 123" in the text box and hits Submit, Context.Activity will look something like:

    { type: 'message',
      value: { userText: 'Testing Testing 123' },
      from: { id: 'xxxxxxxx-05d4-478a-9daa-9b18c79bb66b', name: 'User' },
      locale: '',
      channelData: { postback: true },
      channelId: 'emulator',
      conversation: { id: 'xxxxxxxx-182b-11e9-be61-091ac0e3a4ac|livechat' },
      id: 'xxxxxxxx-182b-11e9-ad8e-63b45e3ebfa7',
      localTimestamp: 2019-01-14T18:39:21.000Z,
      recipient: { id: '1', name: 'Bot', role: 'bot' },
      timestamp: 2019-01-14T18:39:21.773Z,
      serviceUrl: 'http://localhost:58453' }

The user submission can be seen in Context.Activity.Value.userText.
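
In C#, Context.Activity.Value arrives as an untyped object (a JObject under the Newtonsoft-based SDK), so pulling one field out of it can be sketched as below. The helper and its name are mine, not part of the SDK:

```csharp
using Newtonsoft.Json.Linq;

public static class AdaptiveCardValues
{
    // Read one input field (by its card "id") out of Activity.Value.
    // Returns null when the value is missing or the field isn't present.
    public static string GetSubmittedValue(object activityValue, string id)
    {
        if (activityValue == null)
        {
            return null;
        }

        var values = activityValue as JObject ?? JObject.Parse(activityValue.ToString());
        return values[id]?.ToString();
    }
}
```

For the card above, AdaptiveCardValues.GetSubmittedValue(turnContext.Activity.Value, "userText") would return the submitted text.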

Note that adaptive card submissions are sent as a postBack, which means that the submission data doesn't appear in the chat window as part of the conversation--it stays on the Adaptive Card.

Capture values submitted by Adaptive Card in waterfall dialog

My issue turned out to be two-fold:


1) Inside my OnTurnAsync method in my DialogBot file I had:

    var postbackActivity = dc.Context.Activity;
    string text = JsonConvert.DeserializeObject<DialogValueDto>(postbackActivity.Value.ToString())?.UserInput;

    postbackActivity.Text = text;
    await dc.Context.SendActivityAsync(postbackActivity);

I was setting the Text property on the postbackActivity variable instead of setting it directly on dc.Context.Activity. Because I was sending that variable through SendActivityAsync, the mistake was masked: the value I wanted still got passed through to the OnEventAsync method in my MainDialog class.

The correct way was to set this directly on the context, not on a copy of it (DOH!):

    dc.Context.Activity.Text = text;

2) Inside the OnEventAsync method in my MainDialog class I had an empty block which caught the response but did nothing with it (it needed to call await dc.ContinueDialogAsync()). However, this was already handled by an existing block of code in the Virtual Assistant template which my empty block was preventing from being hit.

    object value = dc.Context.Activity.Value;

    if (condition)
    {
        // do nothing
    }
    else if (value.GetType() == typeof(JObject))
    {
        // code from the Virtual Assistant template to check the values passed through
        var submit = JObject.Parse(value.ToString());

        // more template code

        // Template code
        if (forward)
        {
            var result = await dc.ContinueDialogAsync();

            if (result.Status == DialogTurnStatus.Complete)
            {
                await CompleteAsync(dc);
            }
        }
    }

Once I removed my empty if block, it fell through to the code it needed (the forward part).


Change list:

DynamicWaterfallDialog:

    public DynamicWaterfallDialog(
        ...
        )
        : base(nameof(DynamicWaterfallDialog))
    {
        ...

        InitialDialogId = nameof(WaterfallDialog);

        var waterfallSteps = new WaterfallStep[]
        {
            UserInputStepAsync,
            LoopStepAsync,
        };

        AddDialog(new TextPrompt(nameof(TextPrompt)));
        AddDialog(new WaterfallDialog(InitialDialogId, waterfallSteps));
    }

DialogBot:

    public override async Task OnTurnAsync(ITurnContext turnContext, CancellationToken cancellationToken)
    {
        ...

        var dc = await _dialogs.CreateContextAsync(turnContext);

        if (dc.Context.Activity.Type == ActivityTypes.Message)
        {
            // Ensure that the message is a postBack (like a submission from Adaptive Cards)
            if (dc.Context.Activity.ChannelData != null)
            {
                JObject channelData = JObject.Parse(dc.Context.Activity.ChannelData.ToString());
                Activity postbackActivity = dc.Context.Activity;

                if (channelData.ContainsKey("postBack") && postbackActivity.Value != null)
                {
                    DialogValueDto dialogValueDto = JsonConvert.DeserializeObject<DialogValueDto>(postbackActivity.Value.ToString());

                    // Only set the Text property for adaptive cards, because the value the user
                    // submits comes through on the Value property for adaptive cards, instead of
                    // the Text property like everything else
                    if (DialogValueDtoExtensions.IsValidDialogValueDto(dialogValueDto) && dialogValueDto.CardType == CardTypeEnum.Adaptive)
                    {
                        // Convert the user's Adaptive Card input into the input of a Text Prompt; must be sent as a string
                        dc.Context.Activity.Text = JsonConvert.SerializeObject(dialogValueDto);

                        // We don't need to post the text as per https://stackoverflow.com/a/56010355/5209435 because this is
                        // automatically handled inside the OnEventAsync method of MainDialog.cs
                    }
                }
            }
        }

        if (dc.ActiveDialog != null)
        {
            var result = await dc.ContinueDialogAsync();
        }
        else
        {
            await dc.BeginDialogAsync(typeof(T).Name);
        }
    }
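
DialogValueDto and its helpers are types from my own project that aren't shown above. As a rough sketch of a shape consistent with how they're used (purely illustrative, not the actual code):

```csharp
// Illustrative sketch only: the real DialogValueDto is not shown in the
// answer, so this is a guess at a minimal shape consistent with its usage.
public enum CardTypeEnum
{
    Adaptive,
    Hero,
}

public class DialogValueDto
{
    // Which kind of card produced this submission.
    public CardTypeEnum CardType { get; set; }

    // The value the user entered on the card.
    public string UserInput { get; set; }
}

public static class DialogValueDtoExtensions
{
    // Deserialization yields null when Activity.Value wasn't a DialogValueDto.
    public static bool IsValidDialogValueDto(DialogValueDto dto) => dto != null;
}
```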

MainDialog:

    protected override async Task OnEventAsync(DialogContext dc, CancellationToken cancellationToken = default(CancellationToken))
    {
        object value = dc.Context.Activity.Value;

        // "value is JObject" also guards against a null Value
        if (value is JObject)
        {
            var submit = JObject.Parse(value.ToString());

            // Null propagation here is to handle things like dynamic adaptive cards that submit objects
            string action = submit["action"]?.ToString();

            ...

            var forward = true;
            var ev = dc.Context.Activity.AsEventActivity();

            // Null propagation here is to handle things like dynamic adaptive cards that may not convert into an EventActivity
            if (!string.IsNullOrWhiteSpace(ev?.Name))
            {
                ...
            }

            if (forward)
            {
                var result = await dc.ContinueDialogAsync();

                if (result.Status == DialogTurnStatus.Complete)
                {
                    await CompleteAsync(dc);
                }
            }
        }
    }

I guess I was expecting that setting the Text property on the context would automatically fire through to my LoopStepAsync (DynamicWaterfallDialog) handler rather than falling into OnEventAsync (MainDialog). I knew I needed to call ContinueDialogAsync somewhere and should have been more suspicious of the final paragraph of my question:

Interestingly enough, the OnEventAsync function of my MainDialog (the one which is wired up in Startup.cs via services.AddTransient) gets fired when I set the Text property of the activity.

So close, yet so far. Hopefully this helps someone else out in the future.

Links that I found helpful:

  • ComplexDialogBot.cs.
  • Question about adaptive cards and waterfalls.
  • GitHub issue about Adaptive Cards and prompts.

I am using Bot framework V4.3, I want to retrieve adaptive card submit values

Dealing with the Re-Prompt

The issue is with your OnTurnAsync() method:

    if (turnContext.Activity.Type == ActivityTypes.Message)
    {
        await Dialog.Run(turnContext, ConversationState.CreateProperty<DialogState>(nameof(DialogState)), cancellationToken);
    }

Every time a user sends a message, it causes a new instance of your dialog to be run. Since Adaptive Card Input gets sent as a PostBack message (which is still a message), it causes the Dialog to run again, re-prompting the user.

If you're going to run dialogs from OnTurnAsync() or OnMessageAsync(), there are a couple of things you can do. Either:

  1. Use if/switch statements. For example, if the message contains "help", run the HelpDialog, or

  2. Start a dialog that saves user responses and skips steps as necessary. You can see an example of this in Core Bot's Booking Dialog. Notice how it saves the user response in each step with something like bookingDetails.TravelDate = (string)stepContext.Result; and checks whether the value already exists in the previous step before prompting, with something like if (bookingDetails.TravelDate == null). For yours, you might store something like userProfile.AdaptiveCardDetails.
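
Distilled out of the Bot Framework types, the skip-steps idea in option 2 reduces to: keep a dictionary of named slots and only prompt for the first one that's still empty. A minimal sketch of that logic (the names here are illustrative, not SDK APIs):

```csharp
using System.Collections.Generic;
using System.Linq;

// Illustrative model of option 2: dialog state is a set of named slots;
// on every (re-)run the dialog prompts only for the first slot with no
// saved value, so an Adaptive Card postBack restarting the dialog
// doesn't re-ask questions that were already answered.
public static class SlotFilling
{
    public static string NextSlotToPrompt(IReadOnlyDictionary<string, string> slots, IEnumerable<string> order)
        => order.FirstOrDefault(name => string.IsNullOrEmpty(slots.TryGetValue(name, out var v) ? v : null));
}
```

When the card submission restarts the dialog, slots filled on the previous run are skipped and the dialog resumes at the first unanswered question.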

Back Button

To get the back button working, let's say it looks like this in your Adaptive Card:

    {
        "type": "Action.Submit",
        "title": "Back",
        "data": {
            "goBack": "true"
        }
    },

When the user clicks "Back", the bot will receive an activity with:

    value: { goBack: 'true' }

Since the user wants to go back and you don't need the data, you could do something like:

    var activity = turnContext.Activity;

    if (string.IsNullOrWhiteSpace(activity.Text)
        && activity.Value is JObject value
        && value["goBack"] != null)
    {
        activity.Text = "Back";
    }

and then in your Dialog step:

    if ((string)stepContext.Result == "Back")
    {
        stepContext.ActiveDialog.State["stepIndex"] = (int)stepContext.ActiveDialog.State["stepIndex"] - 2;
    }

Adaptive Card response from a WaterfallStep Dialog MS Bot framework v4

After digging for some way forward I came across:

Issue#614

Thus, to make an adaptive card response work from a dialog, I made a compatible adaptive card prompt with one modification each to Prompt.cs and TextPrompt.cs from the Microsoft Bot Framework:

    Prompt.cs => Prompt2.cs
    TextPrompt.cs => CustomPrompt.cs

Prompt2.cs :

    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Threading;
    using System.Threading.Tasks;
    using Microsoft.Bot.Builder.Dialogs.Choices;
    using Microsoft.Bot.Schema;
    using Newtonsoft.Json;

    namespace Microsoft.Bot.Builder.Dialogs
    {
        // Reference: Prompt.cs

        /// <summary>
        /// Basic configuration options supported by all prompts.
        /// </summary>
        /// <typeparam name="T">The type of the <see cref="Prompt{T}"/>.</typeparam>
        public abstract class Prompt2<T> : Dialog
        {
            private const string PersistedOptions = "options";
            private const string PersistedState = "state";

            private readonly PromptValidator<T> _validator;

            public Prompt2(string dialogId, PromptValidator<T> validator = null)
                : base(dialogId)
            {
                _validator = validator;
            }

            public override async Task<DialogTurnResult> BeginDialogAsync(DialogContext dc, object options, CancellationToken cancellationToken = default(CancellationToken))
            {
                if (dc == null)
                {
                    throw new ArgumentNullException(nameof(dc));
                }

                if (!(options is PromptOptions))
                {
                    throw new ArgumentOutOfRangeException(nameof(options), "Prompt options are required for Prompt dialogs");
                }

                // Ensure prompts have input hint set
                var opt = (PromptOptions)options;
                if (opt.Prompt != null && string.IsNullOrEmpty(opt.Prompt.InputHint))
                {
                    opt.Prompt.InputHint = InputHints.ExpectingInput;
                }

                if (opt.RetryPrompt != null && string.IsNullOrEmpty(opt.RetryPrompt.InputHint))
                {
                    opt.RetryPrompt.InputHint = InputHints.ExpectingInput;
                }

                // Initialize prompt state
                var state = dc.ActiveDialog.State;
                state[PersistedOptions] = opt;
                state[PersistedState] = new Dictionary<string, object>();

                // Send initial prompt
                await OnPromptAsync(dc.Context, (IDictionary<string, object>)state[PersistedState], (PromptOptions)state[PersistedOptions], false, cancellationToken).ConfigureAwait(false);

                // Customization starts here for Adaptive Card responses:
                /* The adaptive card attachments are removed from the persisted state after
                 * prompting the user, because there is no implicit support for adaptive card
                 * attachments. Keeping the attachment causes an exception:
                 * Newtonsoft.Json.JsonReaderException: Error reading JArray from JsonReader. Current JsonReader item is not an array: StartObject. Path '['BotAccessors.DialogState'].DialogStack.$values[0].State.options.Prompt.attachments.$values[0].content.body'.
                 */
                var option = state[PersistedOptions] as PromptOptions;
                option.Prompt.Attachments = null;
                /* Customization ends here */

                return Dialog.EndOfTurn;
            }

            public override async Task<DialogTurnResult> ContinueDialogAsync(DialogContext dc, CancellationToken cancellationToken = default(CancellationToken))
            {
                if (dc == null)
                {
                    throw new ArgumentNullException(nameof(dc));
                }

                // Don't do anything for non-message activities
                if (dc.Context.Activity.Type != ActivityTypes.Message)
                {
                    return Dialog.EndOfTurn;
                }

                // Perform base recognition
                var instance = dc.ActiveDialog;
                var state = (IDictionary<string, object>)instance.State[PersistedState];
                var options = (PromptOptions)instance.State[PersistedOptions];

                var recognized = await OnRecognizeAsync(dc.Context, state, options, cancellationToken).ConfigureAwait(false);

                // Validate the return value
                // NOTE: unlike the SDK's Prompt.cs, this copy does not invoke a supplied
                // validator; if one is passed in, recognition is never treated as valid.
                var isValid = false;
                if (_validator != null)
                {
                }
                else if (recognized.Succeeded)
                {
                    isValid = true;
                }

                // Return recognized value or re-prompt
                if (isValid)
                {
                    return await dc.EndDialogAsync(recognized.Value).ConfigureAwait(false);
                }
                else
                {
                    if (!dc.Context.Responded)
                    {
                        await OnPromptAsync(dc.Context, state, options, true).ConfigureAwait(false);
                    }

                    return Dialog.EndOfTurn;
                }
            }

            public override async Task<DialogTurnResult> ResumeDialogAsync(DialogContext dc, DialogReason reason, object result = null, CancellationToken cancellationToken = default(CancellationToken))
            {
                // Prompts are typically leaf nodes on the stack but the dev is free to push other dialogs
                // on top of the stack which will result in the prompt receiving an unexpected call to
                // dialogResume() when the pushed on dialog ends.
                // To avoid the prompt prematurely ending we need to implement this method and
                // simply re-prompt the user.
                await RepromptDialogAsync(dc.Context, dc.ActiveDialog).ConfigureAwait(false);
                return Dialog.EndOfTurn;
            }

            public override async Task RepromptDialogAsync(ITurnContext turnContext, DialogInstance instance, CancellationToken cancellationToken = default(CancellationToken))
            {
                var state = (IDictionary<string, object>)instance.State[PersistedState];
                var options = (PromptOptions)instance.State[PersistedOptions];
                await OnPromptAsync(turnContext, state, options, false).ConfigureAwait(false);
            }

            protected abstract Task OnPromptAsync(ITurnContext turnContext, IDictionary<string, object> state, PromptOptions options, bool isRetry, CancellationToken cancellationToken = default(CancellationToken));

            protected abstract Task<PromptRecognizerResult<T>> OnRecognizeAsync(ITurnContext turnContext, IDictionary<string, object> state, PromptOptions options, CancellationToken cancellationToken = default(CancellationToken));

            protected IMessageActivity AppendChoices(IMessageActivity prompt, string channelId, IList<Choice> choices, ListStyle style, ChoiceFactoryOptions options = null, CancellationToken cancellationToken = default(CancellationToken))
            {
                // Get base prompt text (if any)
                var text = prompt != null && !string.IsNullOrEmpty(prompt.Text) ? prompt.Text : string.Empty;

                // Create temporary msg
                IMessageActivity msg;
                switch (style)
                {
                    case ListStyle.Inline:
                        msg = ChoiceFactory.Inline(choices, text, null, options);
                        break;

                    case ListStyle.List:
                        msg = ChoiceFactory.List(choices, text, null, options);
                        break;

                    case ListStyle.SuggestedAction:
                        msg = ChoiceFactory.SuggestedAction(choices, text);
                        break;

                    case ListStyle.None:
                        msg = Activity.CreateMessageActivity();
                        msg.Text = text;
                        break;

                    default:
                        msg = ChoiceFactory.ForChannel(channelId, choices, text, null, options);
                        break;
                }

                // Update prompt with text and actions
                if (prompt != null)
                {
                    // Clone the prompt set in the options (note ActivityEx has Properties so this is the safest mechanism)
                    prompt = JsonConvert.DeserializeObject<Activity>(JsonConvert.SerializeObject(prompt));

                    prompt.Text = msg.Text;
                    if (msg.SuggestedActions != null && msg.SuggestedActions.Actions != null && msg.SuggestedActions.Actions.Count > 0)
                    {
                        prompt.SuggestedActions = msg.SuggestedActions;
                    }

                    return prompt;
                }
                else
                {
                    msg.InputHint = InputHints.ExpectingInput;
                    return msg;
                }
            }
        }
    }

CustomPrompt.cs :

    using System;
    using System.Collections.Generic;
    using System.Threading;
    using System.Threading.Tasks;
    using Microsoft.Bot.Builder;
    using Microsoft.Bot.Builder.Dialogs;
    using Microsoft.Bot.Schema;

    namespace HotelBot
    {
        // Reference: TextPrompt.cs
        public class CustomPrompt : Prompt2<string>
        {
            public CustomPrompt(string dialogId, PromptValidator<string> validator = null)
                : base(dialogId, validator)
            {
            }

            protected override async Task OnPromptAsync(ITurnContext turnContext, IDictionary<string, object> state, PromptOptions options, bool isRetry, CancellationToken cancellationToken = default(CancellationToken))
            {
                if (turnContext == null)
                {
                    throw new ArgumentNullException(nameof(turnContext));
                }

                if (options == null)
                {
                    throw new ArgumentNullException(nameof(options));
                }

                if (isRetry && options.RetryPrompt != null)
                {
                    await turnContext.SendActivityAsync(options.RetryPrompt, cancellationToken).ConfigureAwait(false);
                }
                else if (options.Prompt != null)
                {
                    await turnContext.SendActivityAsync(options.Prompt, cancellationToken).ConfigureAwait(false);
                }
            }

            protected override Task<PromptRecognizerResult<string>> OnRecognizeAsync(ITurnContext turnContext, IDictionary<string, object> state, PromptOptions options, CancellationToken cancellationToken = default(CancellationToken))
            {
                if (turnContext == null)
                {
                    throw new ArgumentNullException(nameof(turnContext));
                }

                var result = new PromptRecognizerResult<string>();
                if (turnContext.Activity.Type == ActivityTypes.Message)
                {
                    var message = turnContext.Activity.AsMessageActivity();
                    if (!string.IsNullOrEmpty(message.Text))
                    {
                        result.Succeeded = true;
                        result.Value = message.Text;
                    }
                    /* Add handling for Value from adaptive card */
                    else if (message.Value != null)
                    {
                        result.Succeeded = true;
                        result.Value = message.Value.ToString();
                    }
                }

                return Task.FromResult(result);
            }
        }
    }

Thus, the workaround until an official Adaptive Card prompt is released for dialogs in the v4 Bot Framework is to use this custom prompt.

Usage: (Only for sending adaptive cards which have submit actions)

Referring to the example in the question section:

    Add(new CustomPrompt("testPrompt"));

The response from the adaptive card's submit action will be received in the next waterfall step, ProcessInputAsync():

    var choice = (string)stepContext.Result;

choice will be the JSON string of the body posted by the adaptive card.
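
If you prefer a typed result over a raw JSON string, the value can be deserialized with Newtonsoft.Json. The HotelRequest type here is only an illustration; its property name must match the id of the card's input:

```csharp
using Newtonsoft.Json;

// Illustrative DTO: "userText" matches the id of the card's Input.Text.
public class HotelRequest
{
    public string userText { get; set; }
}

public static class SubmitParsing
{
    // Turn the JSON string from stepContext.Result into a typed object.
    public static HotelRequest Parse(string choice)
        => JsonConvert.DeserializeObject<HotelRequest>(choice);
}
```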
