Form processing in AI Builder

Form processing

Form processing learns the structure of your documents from examples you provide and then extracts text from any matching form. Examples might include tax forms or invoices.

In this lab, we will build and train a model that recognizes invoices. Then we will build a tablet app to show the detection in action and digitize the content.

Note: If you are building the first model in an environment, click on Explore Templates to get started.

 

Exercise 1

  1. From the left navigation expand AI Builder and select Build. Select Form Processing.

  1. Name your model. Because you are working in a shared environment, make sure to include your name as part of the model name. This will make it easier to find later. Click Create.

  1. Your screen should look like the following image. Select Add documents.

 

  1. Add the documents from the Train folder. You must have at least five documents to train the model.
  2. Confirm the selection and click Upload.

  3. Once your uploads are complete, select Analyze.

 

  1. Select the fields.

  2. Hover over the highlighted fields and confirm the fields that should be returned when a form is processed by the trained model.

  3. Once you have confirmed the fields, click Done.

  1. Train your model.

  1. Locate and open your saved model. If you need help finding it, type your name into the search box.

  1. Review the results of the trained model.

  1. Perform a test with the test invoice.
  2. Perform a test with another image or document.

  1. Publish the model.

 

 

Exercise 2

 

  1. Navigate to Apps and create a new Canvas App. Select Blank app with a tablet layout.

  1. Insert the Form Processor control from AI Builder.

  1. Map it to your saved model.

  1. Drag and resize the control to match the image below.

  1. Play your app.

  2. Click Analyze and add your test file.

  3. Your uploaded form will be analyzed.

 

  1. You can see the mapped fields are recognized.

  1. Close the app player.

  2. Let's take some of the data fields and place them on the screen for the user to review. Add three labels to the screen. Drag them to the right side of the screen and line them up as in the image below. Edit their text to "Invoice Number", "Due Date", and "Total".

  3. Add Text input fields for each row and place them as below.

  4. Now we will map data from the analyzed document. Edit the Default value of each text input as follows; a note on handling an empty result follows the mappings:

     

Invoice Number: FormProcessor1.FormContent.Fields.INVOICE

Due Date: FormProcessor1.FormContent.Fields.'Due Date'

Total: FormProcessor1.FormContent.Fields.Total
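Until a document has been analyzed, FormProcessor1.FormContent is empty and these Default formulas return nothing. If you would rather show a placeholder in the inputs, a minimal sketch for the Invoice Number input (assuming the control keeps its default name, FormProcessor1) is:

    // Hypothetical Default formula: show a placeholder until a form has been analyzed
    If(
        IsBlank(FormProcessor1.FormContent.Fields.INVOICE),
        "No invoice analyzed yet",
        FormProcessor1.FormContent.Fields.INVOICE
    )

The Due Date and Total inputs can follow the same pattern.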

 

  1. Play the app and add an invoice to be analyzed.

Creating a Power Virtual Agent Bot

If you haven’t seen the news, the Power Platform has a new member of the family: Power Virtual Agents!

Power Virtual Agents provides exceptional support to customers and employees with AI-driven virtual agents. Easily create and maintain bots with a no-code interface.

If you aren’t familiar with bots, a bot is a computer program that conducts a text conversation with your customers to direct them to what they need quickly without requiring your human agents to intervene. Bots are a great way to answer simple, repetitive questions from your customers and to help them do repeatable tasks like find out how to return or exchange an item, join your rewards program, or cancel an order (which you’ll learn how to do in this training). Bots save your agents time (and your company money) by freeing agents to focus on more complex problem-solving and handle more valuable customer interactions.

This blog post is an introductory walkthrough of how to get started creating your own bot.

Step 1: Navigate to https://powervirtualagents.microsoft.com/en-us/ and select “Try Preview”

Step 2. Log in to the tenant you want to create your bot in.

This presumes you already have an environment created in your Office 365 tenant. If you don't already have a tenant or environment, directions can be found in the "An App in Two Hours" lab in the Power Apps training: https://aka.ms/powerappstraining

Step 3. Create a new bot.

Because there is no bot in your tenant yet, you will automatically be prompted to create one.

Step 4. Set up the bot options

If you don't want to create your bot in the default environment, you can set this under "More Options".

Step 5. That's it! Your bot is built, and it is time to test it!

To do this, turn on tracing.

Step 6. Type some text that matches one of your greeting trigger phrases.

Step 7. Watch the processing and workflow

Step 8. Customize your bot

Now that you have seen your bot in action, let’s customize it!

The easiest place to start is the greeting. If you toggle tracing back off, you will see this option.

Step 9. Change your greeting

Step 10. Verify your Updates!

Simply rerun your bot and greet it; the response should now come back with your new greeting!

Congratulations, you have just created your own bot with no code and only simple configuration changes.

In the next post we will show you how to deploy it!

For more information check out the forum at https://aka.ms/virtualagentforum

Using the Sentiment Analysis Action from PowerApps

 

AI Builder has some amazing features. This walkthrough will get you started using its sentiment analysis from PowerApps.

  1. Log in to PowerApps


  1. Navigate to Solutions

     


 

  1. Create a new Solution

  1. Open your Solution

  1. Add a new Flow

  1. Set the trigger to PowerApps. Note that I also named the flow at this step: "Sentiment from PowerApps".

  1. Add the “Predict” action to the Flow.

    Note: if you don't see the "Predict" action, you are likely NOT in a solution. The flow must be created inside a solution.

  1. Set the model to “SentimentAnalysis Model”

    Note: the list also shows the other AI Builder models I had created that are available to the flow.

  1. Insert the following text into the Request Payload:

    {"text":"My Text", "language":"My Language"}

     

    This is from https://docs.microsoft.com/en-us/ai-builder/flow-sentiment-analysisc (watch out for smart quotes: they will break the JSON!)
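    For example, a hard-coded payload you could use to test the flow before wiring in the PowerApps parameters might look like this (the sentence is just an illustration; "en" is the language code for English):

    {"text":"The service at this restaurant was wonderful.", "language":"en"}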

     

 

 

  1. Replace the "My Text" argument with Ask in PowerApps by clicking the "Ask in PowerApps" shape at the bottom of the action.

  1. Replace the "My Language" argument the same way, using the "Ask in PowerApps" shape at the bottom of the action.

 

  1. Add a new Step and add the Parse JSON action

  1. Specify the content as the Response Payload. Specify the schema as the JSON below.

JSON for the Schema:

 

 

{
  "type": "object",
  "properties": {
    "predictionOutput": {
      "type": "object",
      "properties": {
        "result": {
          "type": "object",
          "properties": {
            "sentiment": {
              "type": "string",
              "title": "documentSentiment"
            },
            "documentScores": {
              "type": "object",
              "properties": {
                "positive": { "type": "number" },
                "neutral": { "type": "number" },
                "negative": { "type": "number" }
              }
            },
            "sentences": {
              "type": "array",
              "items": {
                "type": "object",
                "properties": {
                  "sentiment": { "type": "string" },
                  "sentenceScores": {
                    "type": "object",
                    "properties": {
                      "positive": { "type": "number" },
                      "neutral": { "type": "number" },
                      "negative": { "type": "number" }
                    }
                  },
                  "offset": { "type": "integer" },
                  "length": { "type": "integer" }
                },
                "required": [
                  "sentiment",
                  "sentenceScores",
                  "offset",
                  "length"
                ]
              }
            }
          }
        }
      }
    },
    "operationStatus": {
      "type": "string"
    },
    "error": {}
  }
}
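To make the schema concrete, here is an illustrative Response Payload it would parse; the values are made up for illustration:

    {
      "predictionOutput": {
        "result": {
          "sentiment": "positive",
          "documentScores": { "positive": 0.97, "neutral": 0.02, "negative": 0.01 },
          "sentences": [
            {
              "sentiment": "positive",
              "sentenceScores": { "positive": 0.97, "neutral": 0.02, "negative": 0.01 },
              "offset": 0,
              "length": 46
            }
          ]
        }
      },
      "operationStatus": "Success",
      "error": {}
    }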

  1. Add a PowerApps Response Action to the Flow

  1. Set the PowerApps Response action to return a text output whose value is the documentSentiment property from the Parse JSON action.
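    One way to do this (assuming the Parse JSON action keeps its default name, Parse JSON) is to add a text output and pick documentSentiment from the dynamic content list; behind the scenes that resolves to an expression along these lines:

    body('Parse_JSON')?['predictionOutput']?['result']?['sentiment']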

  1. Save the flow
  2. Go back to Solutions

  1. Add a new Canvas App. (This one uses a phone form factor, but the form factor isn't really important.)

  1. Add a Text Input control and a Button control

 

  1. Add a label control and set its Text property to mysentiment.sentiment

    PowerApps will complain about this because mysentiment doesn't exist yet; ignore the error for now.

  1. Select the button you added above, open the Action menu, select Flows, and then select the flow you created above.

NOTE: This is currently not working because Flows cannot yet be referenced from a PowerApp inside a solution.

It is working in our staging environment (where the screenshots were taken) and should be available soon!
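As a sketch of that wiring (assuming the flow shows up as SentimentfromPowerApps, your text input is named TextInput1, and the flow's output parameter is named sentiment), the button's OnSelect would look something like:

    // Hypothetical OnSelect: run the flow and store its response in mysentiment
    Set(mysentiment, SentimentfromPowerApps.Run(TextInput1.Text, "en"))

The label's mysentiment.sentiment then reads the value returned by the flow.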

  1. And here it is running!

Note that the AI action returns many sentiment details, such as the scores for each sentiment type.

AI Builder Object detection Hands On Lab

 

Download the latest version of this lab here:

https://github.com/microsoft/PowerApps-Samples/blob/master/ai-builder/labs/AIBuilder_Lab.zip

 

 

Object detection

Object detection lets you count, locate, and identify selected objects within any image. You can use this model in PowerApps to extract information from pictures you take with the camera or from images you load into an app.

In this lab, we will build and train a detection model and build an app that uses the detection model to identify objects from available images.

To get started with AI Builder

Go to PowerApps.com and sign in

 

Go to solutions and import the AI_Builder Sample

 

Navigate to AI Builder

 

 

Note: If you are building the first model in an environment, click on Explore Templates to get started.

 

 

Exercise 1

In this exercise we will build and train the Object Detection model for three varieties of tea.

  1. In PowerApps maker, expand AI Builder and select Build. Select Object Detection.

 

  1. Name your model Green Tea Product Detection. Because you are working in a shared environment, also make sure to include your name as part of the model name. This will make it easier to find later. Click Create.

  1. Your screen should now look like the image here.

  1. Notice the progress indicator on the left. Those are the steps we will follow now to build and train our model.

  1. We are now going to define the objects we are tracking. Click on Select object names.

 

  1. From the entity list, select Object Detection Product.

 

  1. Select the Name field and click Select field.

 

NOTE: Because the solution import didn't bring in this data, you can enter it via Data.

  1. Select the tea items and click Next.

 

  1. Notice the progress indicator has moved forward to the Add images step.

  1. Click add images.

  1. Select images from the set provided. You will need enough images to provide 15 samples for each type of tea we are tracking.

  1. Approve the upload of images. Click Upload images. After the upload completes, click Close.

 

  1. Click next to begin tagging the images.

  1. Select the first image to begin tagging.

  1. Hover over the image near an item you wish to tag. A dotted-line box should appear around the item, indicating it has been detected as a single item that can be tagged.

  1. Click on the item and select the matching object name.

  1. If the pre-defined selector is not accurate, as in the example below, you can drag the container's borders to accurately tag the item.

  1. Do this for each item in the image and for each image in your set. When you have tagged all of the images you uploaded click Done Tagging in the top right of the screen.

  1. Once you have completed tagging, you will get a summary of the tags. If you haven’t tagged enough for analysis, you will need to load and tag more examples.

 

  1. Once you have defined enough tags for training the model, you will be allowed to initiate the training. Click Next.

  1. Click Train.

  1. The training takes a few moments.

  1. Navigate to the saved model view and confirm your model has completed training.

  1. Select the model you just made.

  1. Select Quick test.

 

  1. Upload or drag and drop one of your test images to be analyzed.

     

  2. You will see the analysis and level of confidence for the match.

 

  1. Upload an image you know will not match. You will see the analysis and level of confidence for the match.

  1. Click close.
  2. Publish your model.

 

Exercise 2

We will now create a canvas app you can use for detecting the items that have been trained in our model. The product will be detected from the image and you will be able to adjust on-hand inventory for the item.

  1. Navigate to Apps, and select Create an app, then select Canvas. If asked, grant permission for the app to use your active CDS credentials.

  1. Select Blank app with Phone layout.

  1. On the maker canvas, select the Insert tab in the ribbon and expand AI Builder. Select Object detector to place this control on your app.

  1. Select the AI model you built.

  1. Resize the control to better use the space.

  1. Make sure to leave room for more items we will be placing soon.

  1. Play your app.

  1. Click on Detect.

  1. Choose one of your test images and click Open.

  1. The image will now be analyzed.

  1. Our model has detected each tea in the image.

  1. Exit the app player.

 

Bonus exercise: build out the data in your canvas app

 

  1. We will now select our data source. Select View from the ribbon and select Data Sources.

  1. Click + Add Data Source.

  1. Add the Common Data Service data source. Do not use Common Data Service (current environment).

  1. Select the Object Detection Products entity and click Connect.

  1. Close the Data pane.
  2. With Screen1 selected in the Tree view, navigate to the Insert ribbon tab, expand Gallery and select Blank vertical gallery.

  1. Rename the Gallery productGallery. You are renaming the gallery so you can reference it from your formulas.

  1. Resize and move the gallery control to fit the available space on the screen, leaving some space at the bottom for using later.

  1. Select the edit icon from the gallery.

  1. Add a label to the gallery.

  1. Click edit again and add a Text input box to the gallery. Resize and place it to line up with the label we’ve already placed. We will be updating inventory counts in this text box.

  1. Rename the Text Input inventoryInput. You are renaming this control so you can reference it from your formulas.

  1. With focus on the Screen1 in the Tree view, click in the ribbon on Insert and select Button.

  1. Drag the button to the bottom of the screen and double-click it to edit the text. Change the text to Update.

  1. We will now add a message label to confirm to the user that their submission was accepted; we will define this logic later. With focus on Screen1, insert a label and drag it to the bottom of the screen.

 

  1. We will now add logic to the controls we’ve placed on the screen. Select the Gallery and replace the Items formula with the following.

    'Object Detection Products'

  1. Select the label in your gallery. Replace the Text formula with the following:

    ThisItem.Name

  1. Select inventoryInput and replace the formula for Default with the following:

    LookUp('Object Detection Products', Name = ThisItem.Name).'Inventory Total'

  1. Select the other label (the one that shows at the bottom of the screen) and replace its text with the following:

    usermessage

  1. You’ll notice that area now looks blank. We will configure that message in our next step.

  1. Select the button control and replace the OnSelect with the following:

    ForAll(
        productGallery.AllItems,
        Patch(
            'Object Detection Products',
            LookUp('Object Detection Products', Name = DisplayName),
            {'Inventory Total': Value(inventoryInput.Text)}
        )
    );
    Set(usermessage, "Updated " & CountRows(productGallery.AllItems) & " items")
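    In plain terms, this formula loops over every row shown in productGallery, uses LookUp to find the matching Object Detection Products record, patches its Inventory Total with whatever value was typed into inventoryInput for that row, and finally sets the usermessage variable that the bottom label displays.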

  1. Play the app again.
  2. Click Detect.

  1. Select an image to evaluate.

  1. Update the quantity for the correct product and click Update.

  1. The bottom should show a message now.

 

 

 

Creating dynamic Power BI tiles inside of PowerApps

Related demo can be found here: 

https://powerusers.microsoft.com/t5/PowerApps-Community-Blog/Integrating-PowerApps-Power-BI-and-Flow…

 
 

The second walkthrough is below:

PowerApps is the perfect complement to Power BI in that it makes it very easy to update or add data to the underlying datasets that Power BI is visualizing.

This walk through is using the Power BI sample data.

Step 1. Create a Dashboard using the Power BI Samples by selecting “Get Data” > Samples


Step 2. Select the sample "Retail Analysis"


Step 3. Create a canvas-based PowerApp


Step 4. Insert a Power BI Control

From the insert ribbon choose controls and scroll to the Power BI Tile


Step 5. Set up the Power BI Control


 

 

Step 6.  Add some buttons to set the filter context

On the Insert ribbon, select Controls and add a few (about three) buttons.


 

Step 7. Copy the TILE URL Text

 


 

Step 8. Create and set a variable to set our visual and our filter context

In the OnSelect of each button, set a variable to the tile URL property; a sketch follows below.
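For example, a sketch for the first button, with the placeholder standing in for the URL you copied in Step 7:

    // Hypothetical OnSelect: store the tile URL in a variable the Power BI control will read
    Set(MyVar, "<paste the Tile URL copied in Step 7 here>")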

 


 

 

Step 9. Set the TileURL property of the Power BI Control

Select the Power BI control and scroll to the TileURL property then set the property to MyVar

 
 


 

Step 10. Copy this text into the second button's OnSelect and add the following to the end of the tile URL:

&$filter=Store/Territory eq 'NC'
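So the second button's OnSelect ends up along these lines (again with the placeholder standing in for your copied tile URL):

    // Hypothetical OnSelect: same tile, filtered to the NC territory
    Set(MyVar, "<paste the Tile URL here>&$filter=Store/Territory eq 'NC'")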

 

Step 11. Test the application

 

PowerApps to update and refresh data in Power BI Hands On Lab

 

This lab walks through how to refresh datasets and create near-real-time reports in Power BI with the PowerApps custom visual.

Starting from Power BI, we need a report that is based on DirectQuery. To save time, you can download this report from here:

https://community.powerbi.com/oxcrx34285/attachments/oxcrx34285/DataStoriesGallery/3027/2/PowerApps_with_refresh.pbix

 

Upload the report to PowerBI.com

 

  1. Please log in to Power BI at https://PowerBI.com using the supplied accounts (PAUser1@powermvps.com through PAUser108@powermvps.com) and the password Pass@word1.

To save some steps, the directions will just show doing this from "My Workspace". Please remember that My Workspace is not really a recommended location.

  1. Select “Get Data” and then Files


  1. Select Local File and upload the PowerApps_with_refresh.pbix file from step #1

 

https://community.powerbi.com/oxcrx34285/attachments/oxcrx34285/DataStoriesGallery/3027/2/PowerApps_with_refresh.pbix

 

 

  1. Update the security for the report connection back to the SQL Server by clicking on the three dots after the PowerApps_with_refresh dataset.

    If you don't see the datasets, click on "My Workspace" to expand the artifacts in the workspace. This may require a couple of attempts!

 


  1. To update the connection credentials, click on Dataset settings, click on the PowerApps_with_refresh dataset, and then select "Edit Credentials".

 

  1. Configure the credentials to use the Basic authentication method with the username Ann and the password provided: Pass@word1.

  1. Test that the report is working!

     

 

  1. Edit the report to add the PowerApps Custom Visual

     

 

  1. Add the PowerApps Custom Visual to the report

 

  1. Add the custom visual to the report canvas

  1. Add all the fields to the visual, specify that we are building a new application, and note the environment.

  1. In the new form wizard specify to add a form

  1. Delete the default Gallery as it won't be used. (Note that it is already populated with data from Power BI.)

 

  1. Specify a Data source for the PowerApps form.

17. Specify that the data source is a SQL Server source

18. Specify to use a new connection

 

19. Create a new SQL Connection

Supply the following data for the connection string:

Server: tcp:powerplatformdemos.database.windows.net

User: Ann

Password: See front of class

 

 

 

 

 

20. Specify to use the “Orders” Table

 

21. Select the fields property for the form.

 

22. Specify to use all the fields

23. Add two buttons.

24. Set the OnSelect property of the second button to:

SubmitForm(Form1); PowerBIIntegration.Refresh()

Also set its Text property to "Submit".
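SubmitForm(Form1) writes the record back to the Orders table, and PowerBIIntegration.Refresh() asks the hosting Power BI report to re-query its data so the visuals reflect the change right away.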

25. Set the OnSelect property of the first button to:

NewForm(Form1)
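NewForm(Form1) puts the form into New mode, so the next Submit creates a new record instead of editing the previous one.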

 

26. Save your application: File > Save > Save (yes, two Saves!)

27. Test your application!

Mid-Month PowerApps Video Round Up!

 

While we are only 11 days into the month, we already have an amazing selection of PowerApps videos for your viewing pleasure!

If you haven't already checked out the Community Gallery, be sure to do so!

 


 

Creating Update-able Power BI Reports with PowerApps

 

 


Using PowerApps Visual in Power BI to interact with data

 

 

 


Creating a PowerApps Menu Components

 

 


PowerApps Deep Linking How To

 

 


PowerApps With Function

 

 


Data Validation Demystified – IsMatch()

 

 


Getting Started with the PowerApps Component Framework

 

 


PowerApps In-App Security Trimming – Easy SharePoint Method

 

 


PowerApps Draggable/Moveable Control

 

 


Adding a Signature Line to PowerApps and Saving

 

 


Walk through creating the internal Microsoft PowerApps Tool “Thrive” application with Pat Dunn

 

 

 


Share PowerApps With Guest / External Users

 

 


Microsoft PowerApps – Pass string array to SQL Stored Procedure from PowerApps using Flow

 

 


Shoutouts Template Configuration – Save data in SharePoint

 

 


Connect XrmToolBox To CDS

 

 


PowerApps CDS Security