Vancouver User Group July 22nd: Data Flows and a sneak peek at the new Commanding v2 for model-driven Power Apps, by Olena Grischenko and Scott Durow


Data Flows in Power Apps by Olena Grischenko  (30 minutes)

No-code integration: what? No-code data migration: why? There are things that aren’t easy to unlearn in the development world. System integration and data migration are two of the most difficult parts of any business application project. Can we solve them with no-code? Yes, we can, in a very easy and cost-effective way. You could ask why: why do we need new tools in a space where we have solved problems the same common way for many years? This talk is useful for developers as well as for people who don’t use code. Learn about new possibilities of the Power Platform that nobody talks about even though they do exist. Learn about simple and not-so-simple transformations from CSV, text files, and SQL to Dataverse. Become a new Power Platform data integration superhero!


Sneak Peek at Commanding for model-driven Power Apps by Scott Durow (30 minutes)

Changing the commanding in model-driven Power Apps has traditionally been reserved for code-based solutions. In this session, Power Platform MVP and User Group leader Scott Durow will give us a sneak peek at how this is going to be much easier in the near future. (Sorry, this portion is currently not planned to be recorded or posted due to its unreleased nature.)



Olena Grischenko


Technomancy – make systems work like magic: #PowerLabs – start with Microsoft technologies!



Scott Durow


Hi, I’m Scott, and I am truly passionate about helping people get the most out of the Microsoft Power Platform.


Scott is a committed and personable software architect/technologist with a successful track record of realising business vision through enterprise/application architectures that are tightly aligned with budget and time scales. By combining his detailed technical knowledge with a clear grasp of the wider commercial issues, Scott is able to identify and implement practical solutions to real business problems. Scott is also an excellent communicator and technical author.

Selecting a Florida medium-class inshore casting reel

Last October I was lucky enough to get a place in Florida on Little Torch Key and spent a lot of my winter chasing grouper, snapper and particularly Tarpon with the gear I brought with me from Washington.

While I caught a lot of fish, and some nice juvenile Tarpon, while in Florida, I came to realize that the gear I brought down wasn’t ideal, as it was designed for Pacific Ocean Sea Bass, Ling Cod and Salmon — all with moderate-to-slow-action rods and reels designed more for dropping and trolling than casting hundreds of times per trip.

For light gear this was easily resolved by getting a 7′ fast extension rod for my Azores 4000-series spinning reel. This setup with a ¼ oz jig head* and a 4″ paddle tail is my go-to for practically everything around Little Torch Key.

For larger fish, medium-sized tackle, and fishing live baits, I would trade back and forth between my Okuma Komodo 364 (just an “okay” casting reel) and my Lexa 400 (a little too big for casting those ¼ oz jig heads), paired with one of the custom three-piece rods I wrapped.

While this medium-sized setup “worked”, due to its size I found myself always gravitating back to the light gear, and I ended up hooking enough large sharks and monster Barracuda on that light setup to realize this 4000-class spinning gear was outclassed by a lot of the fish I wanted to be targeting.

Clearly a new Medium-Heavy class setup was called for!

Since I wrap my own rods, picking a heavy fast-action 7′ to 7′6″ blank and acid-wrapping it was the easy part.

The challenge came in selecting the reel, as there are several great choices and I wasn’t certain how they really line up.

Starting with what I didn’t like in my current gear:

  • Lexa 400: great casting reel, but a little large.
  • Komodo 364: great size and great drag, but not a great casting reel due to its non-disengaging level wind.

With that information I went out and collected all the information I could about the Shimano and Daiwa reels in this size.

I eliminated all the Okuma and Penn reels as they do not have disengaging level winds, and removed Abu Garcia as I don’t see many of them in the shops down in the Keys and they don’t seem to be as well known for saltwater.

With that being the case let’s take a look at the characteristics of these reels!





Street prices for the candidate reels (the full comparison also covered weight, 14 lb line capacity, and line retrieved per crank):

  • ProRex 400 – $319.00
  • Lexa 300 – $200.00
  • Lexa 400 – $249.00
  • Tatula 300 – $269.00
  • Tranx 300 – $279.00
  • Tranx 400 – $299.00
  • Curado 300 – $199.00


While weight isn’t really the issue, it is often an indicator of a reel’s size and its ability to be palmed while casting, and should give insight into how comfortable it will be for 8+ hours of casting.

While the Lexa 400 is clearly the standout heaviest reel in the collection, what was amazing is how similar the rest of these reels were at 11–12 ounces; it seemed any of these reels would work in terms of size and weight.

Line Capacity

As this is to be my medium-heavy rig, it needed the line capacity to let fish run! With most of the reel weights being the same, a reel with larger line capacity seemed preferable, so either the ProRex or the Tranx 400.


Just as weight is an indicator of size, price can inform you of build quality. I must admit I was surprised to see the ProRex more expensive than the Tranx!


External Anti-Lash Brakes

I must admit I started this effort to justify buying a Tranx, but finding out that Shimano uses an anti-lash braking system that forces you to take the side cover off leveled the playing field!

In Florida I fish a very small (13′) technical skiff and have found “If something can fall into the water…it will!”



After using a couple of Lexas and Curados, I was confident any of these reels would be a great companion for my slightly-too-large Lexa 400.

After looking at the data and their price points, I decided on getting a Tatula 300.

While looking for a good price on a Tatula 300, I found a ProRex 400 sitting in SportCo for $230!

While not the reel I had decided on, getting the most expensive reel in the roundup for the cheapest price was simply too good a deal to pass up!

Winner: Daiwa ProRex 400

*One issue I am finding here in the USA is that most of the ¼ oz jig heads have small wire hooks. In Australia it is pretty common to have 1/8 and ¼ oz jig heads with larger 5/0 and 7/0 hooks.

Creating Azure Functions and Custom Connectors, and using them inside of Power Apps

There are over 400 connectors to connect Power Apps to almost every conceivable service, database, or back end. This article shows how to connect to endpoints that don’t have an existing connector, using custom, purpose-built Azure Functions. The documentation does a great job of walking through creating an Azure Function; this walkthrough will assume you have followed those directions to create an Azure Function, and starts from there.

Creating the Azure Function

Step 1. Create a new Function using the Azure Functions Extension.

Select the Azure Functions extension for Visual Studio Code and add a new function using the lightning bolt glyph.

Step 2. Set the function to use HttpTrigger trigger

Step 3. Name the Azure function.

In this example we have named the function “SydneyWeather”

Step 4. Supply the name space name.

In this example we have used the name “Getweather.Function”

Step 5. Set the Permission Access type.

Since this is a demo, set the access to anonymous.

Step 6. Check in the Function to version control.

Step 7. Deploy this function into the Azure Subscription.

This deployment updates the function created earlier in the lab.

(Note: this is for demo purposes only; normally one would not check in directly to a production site.)

Step 8. Run the Function interactively in a browser.

In this example the function was deployed to a function app with the name “GetTemperature” and can therefore be called using the following URL:

This returned the following:

    “This HTTP triggered function executed successfully. Pass a name in the query string or in the request body for a personalized response.”


If a query string is passed in like the following:

The function will return the following:

    “Hello, Chuck. This HTTP triggered function executed successfully.”
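The template’s request handling can be sketched in Python — a hypothetical re-expression of the default C# HTTP trigger template’s logic, not code from the lab:

```python
def http_trigger(query, body=None):
    """Mimic the response logic of the default Azure Functions HTTP
    trigger template: look for 'name' in the query string first,
    then in the request body."""
    name = query.get("name") or (body or {}).get("name")
    if name:
        return f"Hello, {name}. This HTTP triggered function executed successfully."
    return ("This HTTP triggered function executed successfully. "
            "Pass a name in the query string or in the request body "
            "for a personalized response.")
```

Passing `{"name": "Chuck"}` as the query dictionary reproduces the personalized response shown above.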


While the Azure Functions extension for Visual Studio Code makes it easy to create an HTTP-based function, the default template returns a plain string rather than JSON. Such a function can still be called from a flow in Power Automate, but Power Apps requires all connectors to return JSON.


Update the function to return JSON

Step 9. Open the function implementation.

Click on the files view of Visual Studio code and the SydneyWeather function

Step 10. Edit the function to return JSON

Comment out the template return and paste in the following code:

            //return new OkObjectResult(responseMessage);

            // Build an anonymous object and serialize it with Newtonsoft.Json
            // (JsonConvert requires "using Newtonsoft.Json;" at the top of the file).
            var myObj = new { name = "Mostly Sunny", location = "Sydney" };
            var jsonToReturn = JsonConvert.SerializeObject(myObj);
            return new OkObjectResult(jsonToReturn);


This should look like the following image:

Step 11. Checkin this update.

Step 12. Deploy these updates:


Step 13. Run the function interactively from the Browser.

Running the updated function in the browser will now return the following JSON:

    {"name":"Mostly Sunny","location":"Sydney"}

Congratulations, your Azure Function is now ready to be called by Power Apps!

Creating a Power Platform custom connector to reference the Azure Function.

Step 14. Log in to Power Apps.


Step 15. Navigate to the Custom Connectors hub under the Data Menu


Step 16. Select New Custom Connector > From Blank Template

Supply a connector name; in this case I simply called it “SydneyWeather”.

Step 17. Supply the initial Connector information.


Step 18. Continue Past the security section (no need to set anything)

Step 19. Set the general definition data

Step 20. Create the request via “Import from Sample”

Change the verb to “Get” and paste in your function URL. From the example above it would be:

Then click Import.



Step 21. Create the response from example body


Using the browser run response, paste in the response to the body and select import. In the example above it would be:

        {"name":"Mostly Sunny","location":"Sydney"}
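As a sanity check, the sample body can be verified to match what the updated function serializes. A small Python sketch (not part of the lab steps):

```python
import json

# The sample body pasted into "Import from Sample"
sample = '{"name":"Mostly Sunny","location":"Sydney"}'

# Python equivalent of the C# anonymous object serialized in the function
payload = {"name": "Mostly Sunny", "location": "Sydney"}

parsed = json.loads(sample)
assert parsed == payload
# "Import from Sample" will infer two string properties: name and location
assert sorted(parsed) == ["location", "name"]
```

If the sample you paste here drifts from what the function actually returns, the connector’s inferred response schema will be wrong, so it is worth this quick check.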


Step 22. Proceed to Test

Step 23. Create the Custom Connector and a connection to it.

Step 24. Test the operation.


Using the connector from Power Apps

Step 25 Create a new Canvas Application.

Step 26. Navigate to the Connectors.

Step 27. Select our custom connector.



Step 28. Use the custom connector to invoke the Azure function.

Add a button and set the text of the button to the custom connector name, namespace and function name.

In this example it would be:



Congratulations, you have created an Azure Function and a custom connector, and called them from Power Apps!

Enjoy a tropical vacation at Paradise Blue

Situated along the continental US’s only tropical reef at Little Torch Key, Paradise Blue is a diver’s or fisherman’s dream vacation location.

For scuba divers, Paradise Blue is located near the Looe Key area of the Florida Keys National Marine Sanctuary.

Paradise Blue Features:





The Florida Keys hold more fishing world records than any other location, and fishing from Paradise Blue it is easy to see why.

Winter is an inshore fisherman’s dream, with Speckled Trout, Mackerel, Kingfish and Grouper on the inshore reefs.

Spring is the beginning of the Tarpon and Permit seasons, with the offshore Blackfin Tuna very willing to visit.

Summer is when the offshore game fish, and the flats for Bonefish and Tarpon, really light up!

All year, Yellowtail, Mutton and Mangrove Snapper await your angling pleasure!


Scuba Diving

Looe Key is a coral reef located within the Florida Keys National Marine Sanctuary. It lies to the south of Little Torch and Big Pine Key. This reef is within a Sanctuary Preservation Area (SPA). Part of Looe Key is designated as “Research Only,” an area which protects some of the patch reefs landward of the main reef.











The reef is named after HMS Looe, which ran aground on the reef and sank in 1744.

Elkhorn coral at Looe Key

Sun Bathing

Paradise Blue offers its residents a private pool, hot tub and sun bathing station on the canal for any sun worshipper looking to achieve a golden hue!


Paradise Blue boasts a tiki bar, cards, board games, a kayak, a stand-up paddle board, a barbecue, a covered sitting lounge, a large screened porch, a large-screen TV, an Xbox One S with over 50 games, and an ice maker capable of making 100 lbs of ice per day; residents should never want for things to do.


Nearby Amenities

With three bars, several restaurants, two tackle shops and a large grocery store within 5 miles, conveniences are never far away.


Little Torch Key is an island in the lower Florida Keys.

Situated along U.S. Route 1 (also known as the Overseas Highway), which crosses the key at about mile markers 28–29, it is immediately preceded to the northeast by Big Pine Key and followed by Middle Torch Key to the southwest.[3]

A small island 24 miles (39 km) from Key West, Little Torch Key is home primarily to locals, living and working from Big Pine Key to Key West. The island is also host to visitors who don’t mind a commute to the popular destination of Key West. There are a few, but not many businesses on the island, including restaurants and lodging.

Like all of the keys in the Torch Keys, this key was probably named for the native torchwood tree, Amyris elemifera L. The north end of the key is the site of a former settlement which was abandoned in 1938 when the highway was relocated.

Its most likely claim to fame is as a relatively frequent fishing destination for U.S. President Harry S. Truman. A Reuters story on February 14, 2009, named a resort there as one of the “Top 10 most romantic retreats”.


For booking Paradise Blue please visit:

Power Platform Developer Training

Application Introduction: an end-to-end employee hardware ordering solution.

This solution simulates how a large organization might supply its employees’ hardware needs. This offering consists of a secure application that enables comparing and ordering company-confidential configurations and pricing, can run on mobile devices, and offers support that takes advantage of global, public-facing locations and knowledge bases. This training will focus on automating the build and deployment of this application.

The business requirements for the application are:


  1. Securely displays only organization IT-approved devices and prices.
  2. Runs on both web and mobile devices.
  3. Friction-free and intuitive device identification.
  4. Streamlined order and approval process.
  5. Built by the internal IT group, which does not have traditional development resources.
  6. Can be integrated and handed off to the company’s governance and updating process to enable GDPR, compliance, and accessibility requirements.


Technologies utilized to achieve the business requirements.

  • Dataverse: makes it easier to bring your data together and quickly create powerful apps using a compliant and scalable data service and app platform that’s integrated into Power Apps.
  • Power Apps: a software-as-a-service application platform that enables power users in line-of-business roles to easily build and deploy custom business apps. You will learn how to build both Canvas and Model-driven styles of apps.
  • Power Automate: enables loosely coupling business logic from the presentation tier.
  • Azure DevOps: automates updates and deployments.
  • Power Apps Component Framework (PCF) controls: extend the application with custom controls.



  1. Create a new environment with a Dataverse database.
    1. Use the solution to create the tables and bring in components.
    2. Import configuration data from a web-based data source into those tables.
  2. Create a Power Apps canvas-based application with components.
  3. Export the solution from the Power Apps environment to source code.
  4. Deploy the solution from source code into a new environment.
  5. Use environment variables in the deployment pipeline to update the solution.
  6. Add a PCF control (Greg Hurlman).


Pre-requisites: Before starting the hands-on lab

Task 1: Download the Power Apps Solution file

Setting up the environment and data

This application will be created in a dedicated environment that enables architectural, security, and organizational separation as well as geographic specificity.


This hands-on lab will give you an opportunity to get hands-on with the best practices to get your app into source control, generate a managed solution from source (your build artifact), and finally deploy the app into another environment. You will need access to 3 Dataverse environments (Development, Build & Production) along with Azure DevOps to automate deployments.

Lab Setup:

You will need to create three environments in your demo or customer tenant. To do this follow these instructions:

  1. Log in to a tenant that you have access to and that has a minimum of 3 GB available capacity, which is required to create 3 environments.
  2. Go to , this will take you to the admin center
  3. Select Environments in the navigation area

  4. Select “+ New Environment” to create your first new environment.

  5. The first environment should be named “Your Name – dev”. Set the region to “United States (default)” and set the Environment type to “Production” (if available); if not, use “Trial”.

  6. Select “Create environment”
  7. Now that your environment has been created select “Create database.”

  8. Set the Currency to “USD” and Language to “English”, Include the sample apps and data then select “Create database”

  9. Your development environment has been created. Follow steps 4–8 above to create a second environment called “Your Name – test”.

  10. Now you have the environments that we will need for this lab


ALM Hands-on Lab Overview:

During this lab you will use the account, login and environments you created in the previous steps. You will get hands-on with the full set of Application Lifecycle Management (ALM) capabilities for the Power Platform. Learn how to use the key concepts that allow customers and partners to manage solutions and deploy them across environments. Additionally, get up to speed with the details of canvas apps and flows in solutions.


  1. Let’s get familiar with the environments that you created (in the pre-lab).
  2. Go to to view the environments you have access to
  3. You will see your environments, one with dev in the name and one with test in the name.

  4. We will use these environments as our primary environments during the lab. Your user will have the “System Administrator” Dataverse role for all environments giving you full access to the environment.
  5. In a new tab open

  6. In the header, you will see a way to switch environments to move between your different environments.

  7. When you change to a new environment, the portal will change to show only content relevant to that environment. Both environments contain a Dataverse database. This allows you to leverage the Dataverse solution infrastructure to move apps, tables, code and other resources from one environment to another.
  8. Select “Solutions” in the navigation

  9. Solutions are how customizers and developers author, package, and maintain units of software that extend Dataverse. Customizers and developers distribute solutions so that organizations can use Dataverse to install and uninstall the app(s) defined in the solution. In the next Module you will learn how to create a new solution



Importing the App in a day Solution.

To enable moving your application across environments, using components and custom entities created earlier, this lab leverages a solution you can find here:

  1. Navigate to and download this file to your hard drive, then select Import in the Solutions home page to bring this solution into your Power Apps environment.


Note: this may take a couple of minutes

  1. Publish customizations to push the Device Order custom entity definition into this environment.

Note: when importing the solution you will be asked to create new connections in this environment for the ones that were in the solution.

This process also populates environment variables in the solution.

Bringing in the Device and Manufacturer Application Data

These steps will populate the Dataverse tables brought in with the solution import, using PowerShell. Note that these same steps can be done manually using the DataMigrationUtility tool.

Installing the needed PowerShell module



Install-Module -Name Microsoft.Xrm.Tooling.PackageDeployment.Powershell -RequiredVersion


After being prompted to run and after a successful install, you should be able to run the following:


$cred = Get-Credential

$crmConn = Get-CrmConnection -OrganizationName chass-Dev -OnLineType Office365 -Credential $cred

Import-CrmDataFile -CrmConnection $crmConn -DataFile "c:\alm\" -Verbose

(note this will take a couple of minutes)


  1. Navigate to the solutions view and open the Device Order Solution

Congratulations you have earned the badge: “Importing Dataverse data”


The Device Ordering Power Apps Canvas App

Do not proceed before going through the lab pre-requisite steps

This lab will create a new device ordering application to replace an existing paper- and email-based system.

Power Apps Canvas Studio Layout

Power Apps Canvas Studio is available as a web application that you can use in any modern browser.

Power Apps Studio is designed to have a user interface familiar to users of the Office suite. It has three panes and a ribbon that make app creation feel like building a slide deck in PowerPoint. Formulas are entered within a function bar that is like Excel. Studio components:

  1. Left navigation bar, which shows all the screens, data sources, and controls in your app
  2. Middle pane, which contains the app screen you are working on
  3. Right-hand pane, where you configure properties for controls, bind to data, create rules, and set additional advanced settings
  4. Property drop-down list, where you select the property for the selected control that you want to configure
  5. Formula bar, where you add formulas (like in Excel) that define the behavior of a selected control
  6. Ribbon, where you perform common actions including customizing design elements
  7. Additional items, here you will find your environment selection, app checker, and the preview app functionality.


Add an order to Device Order Common Data entity using a form.

The solution imported was a read-only view; in this exercise we are going to update it to take orders and write them into the Orders table.

  1. Open the Device Ordering Application from the solutions view.


  1. Click on Insert > Forms and select Edit

  1. Resize the edit form to the bottom of the application.

  1. Set the Data Source of the form to the “Device Orders” entity

  1. Remove the field “Created On”

  1. Add the fields Name, Price, Requested By, and Requested Date

  1. Set the form to have two columns

  1. To set the properties of the Name field on the edit form, select the Name text box and then Advanced properties. Select “Unlock to change properties”

  1. Scroll down to Default property and type:

'Device Gallery'.Selected.'Device Name'

  1. Unlock the “Requested By” property

  1. Set the “Requested By” property to


  1. Unlock the “Price” property

  1. Set the “Price” default property to:

Text('Device Gallery'.Selected.Price, "$.00")


  1. Rename our edit form from Form1 to “OrderForm”


  1. Add an icon for saving. Click the Icons menu, select “Check”, and move it to the bottom right of the screen as indicated below.

  1. To write the order details to the Dataverse table, select the Check icon and in the OnSelect property type:


NOTE: If you try running your application now, you will find your order form disappears! Let’s “fix” this!

  1. To display the edit form, select the Device Gallery, choose the “OnSelect” property, and type:



  1. Click File and select Save.
  2. Click the back arrow.

Congratulations badge: “Writing Data to practically any data source!”


Using Environment Variables

The Device Ordering application offers a link for customers to request support. While in the development phase, the team doesn’t want to send test requests to the support team, but they also don’t want to make application changes for their production deployment, which is what they are doing now.

This section will walk through updating the application to use environment variables, which solves this problem.

For more information about using environment variables directly in Canvas applications see this great community article:

Working with Environment Variables in Canvas Power Apps and Power Automate Flows | The CRM Chap

Done and looking for something else to do?

  • Apply a theme to your application for a new look and feel.
  • Add navigation to the header component.
  • Give the controls a responsive layout
  • Create your own component for the form submission.

Create an Azure DevOps Project

In this section of the hands-on lab, you will create an Azure DevOps project, setup permissions and create the required pipelines to automatically export your app (as an unmanaged solution) from a development environment, generate a build artifact (managed solution) and finally deploy the app into production. The lab will also introduce you to Microsoft Power Platform Build Tools.

We are going to use Azure DevOps for both source code storage and build and deployment automation. You can use any automated source control and build automation tools using the same principles. Your configuration data should be exported from a development environment, processed by Package Deployer and checked into source control.

  1. Log into with your credentials and click on your organization in the left panel. Follow the instructions to create a project. Click Create Project.

For the Pilot the instructor will give you these credentials i.e. using the organization

  1. Create a DevOps project by going here and selecting “Sign into Azure DevOps”

    For the Pilot we will be using the organization:


  2. Select “+ Create project”

  3. Create a Project name called “Your Name – DevOps Project”, make it a Private project and select “Create”

  4. Your DevOps project has now been created, please note the URL on the top for your project and bookmark it or save it locally

  5. You are now ready to begin the ALM Hands-on Lab


  1. When the project is completed, you will need to create a Repo to hold the source code. Click on the Repos link in the left navigation.

  1. Initialize the repo with the default README by clicking the Initialize button.

Below is a screenshot of the repo post initialization.

Install and Enable the Azure DevOps Extensions for Power Platform

In the pilot class we are using a Beta of the Power Platform Build Tools. To enable this, we are all sharing one Azure DevOps Organization and this section has been done for you.

Soon you will be doing this in your own organizations using the Azure DevOps Market Place.

  1. Navigate to the organization page, click on the Azure DevOps links then the Organization Settings link in the bottom left of the navigation panel


  2. On the organization settings page, click the Extensions link in the left navigation panel


  3. For this lab we will be using a beta build of the Power Platform Build Tools to enable the use of environment variables. This beta extension has already been shared with this organization, and can be found by going to the Shared tab.

  1. Select Install

  2. Select “Go to Marketplace”



After the event, you will find the Power Platform Build Tools in the Azure DevOps Marketplace.

In addition to creating build pipelines, we will be updating environment variables and will be using the “Replace Tokens” extension; this too needs to be installed.

  1. Click ‘Browse marketplace’. This will redirect you to the Azure DevOps Marketplace. Search for “Replace Tokens” and select the Replace Tokens extension. Fun side note: this widely used and popular extension was written and is maintained by a Microsoft MVP.

  1. Click ‘Get it free’.

  1. Click ‘Install’
  2. Click ‘Proceed to organization’

Configure Azure DevOps Permissions for Build Service Account

The build pipelines that we set up later will be exporting files from an org and checking them into your source code repo. This is not a default permission of Azure DevOps, so we need to configure the appropriate permissions for the pipeline to function properly.

  1. Navigate back to the project created earlier:


  1. Click the Project Settings icon on the lower left of the screen and click the Repositories link in the fly out menu.

  1. Navigate to the Repositories Menu

  1. Type in Project Collection Build Service in the search box and select it (select the project collection with the username appended)

  1. The setting for the user is displayed. Select Allow for ‘Contribute’ and ensure that the green checkbox is displayed.

Create a Service Connection

These build pipelines will be importing and exporting solutions from environments that could potentially span tenants. Service Connections is where the credentials are stored for this access. This section will walk through the Service Connection creation.

  1. Under Project settings select “Service Connections”

  1. Select: “Create service connection” Button


  1. Select Generic. Note: for production pipelines it is recommended that you use the Power Platform service connection for more secure scenarios like multi-factor authentication (MFA).

  1. Specify the Server URL for the environment where we imported and setup our solution.

  1. Retrieve the Server URL in the Power Platform Administration Center:

  1. Supply the credentials for your tenants as supplied by the instructor.

  1. Save the Service Connection

Build Pipeline 1: Create Export from Dev

The first pipeline you will create will export your solution from your development environment as an unmanaged solution, unpack it and check it into source control (your repo)

  1. Start with YAML file



  1. Click the Create Pipeline button.

  1. Select “Azure Repos Git” for “Where is your code”?

  1. Select our Hello DevOps repo created above.

  1. Select the “Existing Azure Pipelines YAML file” option

  1. Select the “Export-From-Dev.yml” file checked in above.

  1. Save and run your newly created pipeline.

NOTE: If you see the following errors, you may have missed the step of adding the Power Platform Build Tools BETA above.

Congratulations you have successfully exported your solution from a Development environment to an Azure DevOps repo!

  1. You can validate what the build did by looking at the repo

Automate the configuration and data import into the Solution

As when we manually imported the solution, it comes in without the state data needed for the application. In this set of steps we are going to automate the import of the application data.

If you didn’t manually import the data above, you will need to download the file here:

  1. Under Repos > Files, open Device Order and select the vertical ellipsis so you can select New > File.

For the file name call it:


Note: The file name should have automatically added the “DeviceOrder” folder name in the dialog.


  1. Commit this file to the repo

  1. Add all the files from to this folder

  1. Drag all three files to the “Drag and drop files here” area and commit the three files. (The reason we didn’t just create a folder above is that Git doesn’t support empty folders.)

  1. Commit those file additions/updates.

Deployment Settings

  1. Deployment settings are contained in the file deploymentSettings.JSON, which resides in the DeviceOrder folder.

To add this file, select the vertical ellipsis to the right of the SolutionPackage folder, select Upload file, and upload this file:

  2. Commit this file addition.

Create the Deploy to Test Pipeline

As above, we will create the pipeline that deploys to the Test environment from an existing file.

  1. Download the pipeline definition file from here:
  2. Upload this pipeline definition into the repo. NOTE: This file needs to be in the root of the project repo.


  1. Upload the build-and-deploy-to-test.yml file to the project repo.

  2. Create a new pipeline by selecting the Pipelines flyout menu.


  3. Create the pipeline based on the Azure Repos Git option.

  4. Create the pipeline based on the build-and-deploy-to-test.yml file in our project repo.

  5. Select the “Existing Azure Pipelines YAML file” option.

  6. Select the build-and-deploy-to-test.yml file and choose “Continue.”

  7. Save the pipeline definition.



Setting up the variables for the Test Deployment Pipeline

The deploy-to-test pipeline definition expects a username, URL, and password for the environment being deployed into. This section will set the values for those variables.

  1. Select the Variables button.

  2. Select New Variable.

  3. Create a variable for the Username using the settings supplied by the instructor, or for the environment you want to deploy to.

  4. Add another variable.

  5. Name it Password and set the value to the credentials supplied for the Test environment.

  6. Create another variable named “Url” and set the value to your Test environment URL.
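As a brief aside on how these values are consumed: in Azure Pipelines, variables defined in the pipeline UI are available to YAML steps through the $(Name) macro syntax. The step below is purely illustrative, not part of the lab’s file:

```yaml
steps:
  # Illustrative only: echoes the non-secret variables to the log.
  # Secret variables such as Password are masked and should never be echoed.
  - powershell: Write-Host "Deploying to $(Url) as $(Username)"
    displayName: Show target environment
```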

Creating the Test Service Connection

As the Test deployment pipeline will be deploying into another environment, and potentially another tenant, a new service connection is needed to maintain the connection credentials.


  1. Under Pipelines, select “Service Connections,” then the “New Service Connection” button.

  2. Set the values of the service connection for the credentials and Server URL to the Test (destination) environment.



Making the pipelines dynamic through Environment Variables

A common development scenario is to stub calls to production services during the development process so as not to flood those services with test calls; these stubs are often called fakes. In the Device Order application, we don’t want to send the support team nonproduction support requests, so we made the support team’s email address an environment variable. In this solution we have two artifacts we want to dynamically update: the connection reference the emails are sent with, and the email address itself (you can see this variable defined as new_SupportEmailAddress). This section will show how to have build pipelines specify these settings as they progress from Dev to Test to Production, without changing the underlying application.

This screenshot shows the environment variable in the solution

This screenshot shows the environment variable properties in the solution.


In our project these settings are held in the deploymentSettings.JSON file.

Contents of deploymentSettings.JSON


{
  "ConnectionReferences": [
    {
      "LogicalName": "new_sharedoffice365_1812f",
      "ConnectionId": "e5c5fd57c8c845dc8fd1d063a89269eb",
      "ConnectorId": "/providers/Microsoft.PowerApps/apis/shared_office365"
    }
  ],
  "EnvironmentVariables": [
    {
      "SchemaName": "new_SupportEmailAddress",
      "Value": ""
    }
  ]
}

To make the properties dynamic, we need to tokenize the deployment settings file and add a replace-tokens task to our pipeline.

  1. Tokenize the deploymentSettings.JSON by adding token characters to the setting values. The Replace Tokens extension will then set these values in the pipeline at runtime.


{
  "ConnectionReferences": [
    {
      "LogicalName": "new_sharedoffice365_1812f",
      "ConnectionId": "#{new_sharedoffice365_1812f}#",
      "ConnectorId": "/providers/Microsoft.PowerApps/apis/shared_office365"
    }
  ],
  "EnvironmentVariables": [
    {
      "SchemaName": "new_SupportEmailAddress",
      "Value": "#{new_SupportEmailAddress}#"
    }
  ]
}
The next step would be to set up the Replace Tokens extension by editing the build pipeline. (This has been done for you.)

Then set the properties of the Replace Tokens extension so it looks only in the solution folder and only at the deploymentSettings.JSON file. (This has been done for you.)
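For reference, the Replace Tokens step that was configured for you typically looks something like the fragment below. The task name and inputs come from the community Replace Tokens extension; the version and input names may differ in your install, and the folder path is an assumption, so verify against the task’s documentation:

```yaml
- task: replacetokens@3
  displayName: Replace tokens in deploymentSettings.JSON
  inputs:
    rootDirectory: '$(Build.SourcesDirectory)\SolutionPackage'  # assumed folder
    targetFiles: 'deploymentSettings.JSON'
    tokenPrefix: '#{'   # matches the tokens added above
    tokenSuffix: '}#'
```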

We then need to create variables to supply the values for the ConnectionId and new_SupportEmailAddress.

  1. Set the ConnectionId variable properties to a name of new_sharedoffice365_1812f and a value of e5c5fd57c8c845dc8fd1d063a89269eb.

  2. Create the new_SupportEmailAddress variable with a name of new_SupportEmailAddress and a value of some email address.

  3. Run the build-and-deploy-to-test pipeline.


Note: If your build gives the error:

“Could not load file or assembly ‘System.Management.Automation.resources”

This is a generic error in the Beta extension for ALL errors. To see what is causing the build error, you will need to enable diagnostics to investigate the real cause. Common causes are missing variables and attempting to deploy managed solutions into environments with unmanaged versions of that solution already imported. For instance, in the diagnostics, you can see the output:

##[debug]Message: Solution manifest import: FAILURE: The solution is already installed on this system as an unmanaged solution and the package supplied is attempting to install it in managed mode. Import can only update solutions when the modes match. Uninstall the current solution and try again.Detail:






For final confirmation, log into your production system and see your application!
<Greg Hurlman to add PCF section>



© 2019 Microsoft Corporation. All rights reserved.

Information in this document, including URL and other Internet Web site references, is subject to change without notice. Unless otherwise noted, the example companies, organizations, products, domain names, e-mail addresses, logos, people, places, and events depicted herein are fictitious, and no association with any real company, organization, product, domain name, e-mail address, logo, person, place or event is intended or should be inferred. Complying with all applicable copyright laws is the responsibility of the user. Without limiting the rights under copyright, no part of this document may be reproduced, stored in or introduced into a retrieval system, or transmitted in any form or by any means (electronic, mechanical, photocopying, recording, or otherwise), or for any purpose, without the express written permission of Microsoft Corporation.

Microsoft may have patents, patent applications, trademarks, copyrights, or other intellectual property rights covering subject matter in this document. Except as expressly provided in any written license agreement from Microsoft, the furnishing of this document does not give you any license to these patents, trademarks, copyrights, or other intellectual property.

The names of manufacturers, products, or URLs are provided for informational purposes only and Microsoft makes no representations and warranties, either expressed, implied, or statutory, regarding these manufacturers or the use of the products with any Microsoft technologies. The inclusion of a manufacturer or product does not imply endorsement of Microsoft of the manufacturer or product. Links may be provided to third party sites. Such sites are not under the control of Microsoft and Microsoft is not responsible for the contents of any linked site or any link contained in a linked site, or any changes or updates to such sites. Microsoft is not responsible for webcasting or any other form of transmission received from any linked site. Microsoft is providing these links to you only as a convenience, and the inclusion of any link does not imply endorsement of Microsoft of the site or the products contained therein.

Microsoft and the trademarks listed at are trademarks of the Microsoft group of companies. All other trademarks are property of their respective owners.



Bringing Power Apps application data into your ALM workflows


The suggested best practice for backing up, recovering, or ALM deployments of Power Apps is to use solutions.

Unfortunately, exporting and importing a solution doesn’t bring the application data along with the entity (table) definitions.

I have been playing with the ALM workflows and wanted to share how you can automate data retrieval when importing your solutions.

The first step is to use Power Apps solutions for your application.

The next step is to extract the schema and data that your application relies on to run. (If you don’t have a solution that uses data, skip this step.)

While you can also automate this (see the PowerShell automation process below), I did it manually with the DataMigrationUtility tool that comes with the developer tools:

Download tools from NuGet (Developer Guide for Dynamics 365 Customer Engagement) | Microsoft Docs

(DataMigrationUtility tool)

The next step is to import your solution into a new environment. If you don’t have a solution to play with, I have created a solution for the App in a Day training that contains tables for the Devices and Manufacturers.

You can find that here: <link>

But as I mentioned above, the application is imported without the data it needs to work. To populate those entities (tables) we could do it manually using the DataMigrationUtility tool, as I did to export the data, but since we will likely want to automate this, let’s use PowerShell:


$cred = Get-Credential

$crmConn = Get-CrmConnection -OrganizationName ContosoEnvironment -OnLineType Office365 -Credential $cred

Import-CrmDataFile -CrmConnection $crmConn -Datafile "" -Verbose
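For completeness, the export side can be automated with the same tooling. The sketch below assumes the Microsoft.Xrm.Tooling.ConfigurationMigration PowerShell module and uses placeholder file paths; verify the parameter names against the module’s own help before relying on it:

```powershell
# Assumes: Install-Module Microsoft.Xrm.Tooling.ConfigurationMigration
$cred = Get-Credential
$crmConn = Get-CrmConnection -OrganizationName ContosoEnvironment -OnLineType Office365 -Credential $cred

# SchemaFile: the schema produced by the DataMigrationUtility (placeholder paths)
Export-CrmDataFile -CrmConnection $crmConn -SchemaFile ".\schema.xml" -DataFile ".\data.zip" -Verbose
```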







Finding and using the Power Apps Sample Templates

In addition to the over 500 community samples, a lot of people don’t realize the product comes with a plethora (in case you ever wondered how many a “plethora” is: in this case it is 34 <g>) of amazing samples and templates.

To use the built-in samples and templates, log into

Then from the home screen select “All Templates”.

Note: This is the same as selecting “Create” on the left-most navigation.

Scroll about one-third of the way down and you will find the built-in application templates.

If you are looking for a suggestion/starting point the training template is hard to beat!

Note: this does not include the Project Oakdale templates!

Those are even easier to locate and use as they are called out in the documentation here!


Remember, if you need training on Power Apps, you can find that here:

Getting Started with Power Apps from Bare Metal

If you want to learn Power Apps or teach a class on Power Apps, it is very likely you will need a clean environment in which to start.

This document will walk through a solution for creating a new trial Microsoft Office tenant, starting a Power Apps trial, and creating a trial environment for you to either go through training content self-paced or deliver that content.


  1. To get started with most Power Apps training you will need an organizational ID and a tenant that will allow you to create an environment. One of the easiest ways to get a new organizational ID and a tenant you have permissions to is to create an Office Developer tenant. To do this, navigate to: and click JOIN NOW.

  2. Fill out the requested information about yourself.

  3. Fill out the requested information about your interests.

  4. Close the introduction dialog.

  5. Now that you have a developer account, we can create an Office trial tenant, with full E5 services!

  6. Fill out the information about yourself. Note: this information is used to create the tenant, and you will use it regularly.

  7. The activation requires an MFA handshake; supply this information for your own phone.

  8. TADA! You now have an E5 Office tenant for 90 days!

  9. Inside this same browser session, navigate to: and select “Try Free,” either from the middle of the page or the menu at the top right.

  10. Log in using the credentials you supplied in step 6 above.

  11. Supply geography information for the Power Apps trial.

  12. Accept the introduction dialog.

  13. At this point you are now ready for any of the self-paced training found at:


Note: While this tenant came with 25 users, it did not come with any Common Data Service capacity. So while the first user will be able to create a trial environment, additional users will need to use that same environment.



Power Platform Community Experts (Mockup Only)

The Power Platform has an amazing community whose members make themselves available day or night and go out of their way to help their fellow App Makers using the Power Platform.

The Power Platform Community Experts is a recognition program focused on recognizing those who are the most proficient and active in their respective fields and geographies. While each award is named for a particular area, be assured these community experts span both technology and interaction experiences.

The leaders have been selected by a committee of Microsoft Product Engineering managers, Power Platform executive leadership and community managers.



While the area of Oceania may have fewer people than areas like Asia, the Americas, and Europe, its impact is amazing, and this section is to recognize those Community Experts from this amazingly active area of the planet.

Dynamics365 Instructor-Led Training

Lisa Crosbie

Delivering a staggering 50 Application in a Day and Dynamics365 classes per year, it is a wonder Lisa has any time to do anything else, but that is not the case: she is a regular at all the Power Platform community events and User Groups, and even has time to help Microsoft update the Power Virtual Agents in a Day training. In Lisa’s own words: “I am genuinely excited about what low code application development means for organisations, and how it can transform the lives and work of so many people. I work as an evangelist for Power Platform and Dynamics 365, which means I get to spend my day talking to people about this awesome technology, what the possibilities are, and how it can help them in their organisations. I am a blogger, trainer, speaker, podcaster and YouTuber. You’ll find me presenting at and participating in all kinds of virtual events, training sessions and webinars, big and small – wherever there are people who want to learn about Power Platform.”


Dynamics365 User Group Delivery

Amey Holden

Delivering a session at one User Group is to be respected; doing User Group sessions at a country level is to be applauded. Amey has taken this to an entirely new level by not just regularly delivering sessions across the globe, but by creating and championing an application that brings those members together. Amey is a proud Microsoft Most Valuable Professional and dedicated ABC (Anything But Code) enthusiast who is passionate about extending and improving Dynamics Customer Engagement & Marketing with the mighty Power Platform.

Power BI Book Author

Reza Rad

In 2019, Reza Rad was on the road delivering community events for an amazing 4 months, flying over 120K miles. Somehow during this time Reza managed to create an encyclopedia of Power BI information that he makes freely available to the community, called From Rookie to Rock Star. Reza is an author, trainer, speaker, and consultant. He has been a Microsoft Data Platform MVP for six years, specializing in Microsoft Business Intelligence and data movement. Reza has worked with Microsoft BI technologies for more than 15 years. He is an MCP, an MCT, and co-leader of the New Zealand Business Intelligence Users Group.

Reza is the author of the books SQL Server Integration Services and Microsoft SQL Server 2014/16 BI, and of the Power BI online book From Rookie to Rock Star, and is the author of the Channel 9 SSIS tutorial video series. You can find his technical articles on his blog at .


Power Automate Video Content

Elaiza Benitez

Elaiza Benitez is a Senior Consultant at Theta. Elaiza started her career as a Sales administrator in 2009 before progressing to the consultancy path both in New Zealand and in Australia where she lived for 6+ years before travelling the world and meeting the Microsoft community. She has worked with Dynamics 365 since version 4 and is competent in Power Apps Portals and Power Automate.

In 2018 she was awarded the ARN Women In ICT – Technology award for Australia. The Technical award recognizes the candidate who has excelled in the technical and engineering segment of the ICT industry, demonstrating a proven depth of knowledge and abilities. The candidate has demonstrated excellence in problem-solving and decision-making skills, and an exemplary level of accomplishment in job performance.

Elaiza Benitez is an international speaker, a YouTuber known for her What the Flow Series, a blogger and an advocate in the global Microsoft community.

AI Builder Conferences

Leila Etaati

Leila is a Data Scientist, PhD, MVP, BI consultant, and speaker. She has over 10 years’ experience working with databases and software systems and was involved in many large-scale projects for big companies. Leila has a PhD in Information Systems from the University of Auckland, and an MS and BS in computer science. She has worked in industries including banking, finance, power and utility, and manufacturing. She is a lecturer and trainer in business intelligence and database design courses at the University of Auckland. Leila speaks at international SQL Server and BI conferences such as Microsoft Ignite, PASS Summit, PASS Business Analytics, PASS Rally, and many SQL Saturdays in the USA, Europe, Australia, and New Zealand on machine learning and analytics topics.


<in process>


<in process>


Power Apps Video Content

Shane Young

With over 150 videos, it is little surprise Shane has been recognized as a leader of top Power Apps video content. Shane has been a Microsoft MVP for the last 14 years, a direct reflection of his love of all things community. His favorite technologies right now are PowerApps and Flow. Power to the people. HA! Speaking, writing, and answering questions in forums and on Twitter are all things that drive him. On Twitter you can find him @ShanesCows, or if you like to learn, check out his YouTube channel for the best PowerApps learning videos around.

Power Apps Instructor-Led Training

Reza Dorrani

Reza is a Microsoft Business Applications MVP & Principal consultant at Catapult Systems. He is a Microsoft Power Apps & Microsoft Power Automate community dual super user. He was awarded the Microsoft Flow All Star award by the community. He is also the founder and leader of the Houston Power Apps and Power Automate User Group.


Power BI Book Author

Ken Puls

Ken’s book is the #1 selling book in the world on the M language shared by Excel, Power BI, Azure Data Factory, and the rest of the Power Platform family.

A Chartered Professional Accountant (FCPA, FCMA) in Canada and the president of Excelguru Consulting Inc., I’m a blogger, author, and trainer with over 20 years of business and financial modelling experience. My passion lies in exploring tools to turn data into information, and teaching others how to benefit from them. I’ve held the Microsoft MVP distinction since 2006, have been recognized as a Fellow of my accounting organization and as one of the “Top 20 under 40” business & community leaders on Vancouver Island, and currently lead the Power BI User Group in sunny Vancouver, Canada. My website can be found at , and holds many code samples for working with Excel and other MS Office apps, as well as my blog and a free help forum.



World Leaders





Power Apps Community Call Oct 21st: Charting Components, Custom Apps in Teams and an enhanced SharePoint web part

October brings us a very special community call, as we are going to unveil/announce a new community offering from Canviz.

Canviz has been working on Power Apps charting components for much of 2019 and has graciously decided to give them to the community. This offering will include nine of the most commonly requested charting visuals (e.g. Scatter, Solid Gauge, Radar, Candlestick, Funnel, and Gantt!). This month’s community call will then proceed with a walkthrough of a Power Apps expense application built with Oakdale (so fully integrated with Microsoft Teams) by Reza Dorrani. The demos will conclude with April Dunnam and Hugo Bernier walking us through an ENHANCED Power Apps SharePoint web part built by our very own community star and MVP Hugo Bernier!



  • Introductions
  • Announcing Canviz Power Apps Charting Components
  • Project Oakdale My Expenses Power App
  • Power Apps Web Part
  • News and Community Contributions


When: October 21, 2020 8AM Pacific Time


Our Presenters:

April Dunnam

Thank you for visiting my blog! My name is April Dunnam.  I’m a Microsoft Business Applications MVP and lead consultant/co-owner of ThriveFast, a Microsoft Partner located in Tulsa, Oklahoma.  I love using technologies like Office 365, SharePoint, Teams, Azure, Flow and PowerApps to help businesses create a thriving automated, collaborative environment. On top of blogging, I speak at local tech events and run the local PowerApps and Flow User Group.  When I’m not working I love going to karaoke or trivia night at the local brewery.


Hugo Bernier

Hugo is someone organizations call when their Office 365, SharePoint, and Dynamics 365 projects are doomed to fail and need help to get things back on track. As the self-proclaimed “World’s Laziest Developer” and a certified SCRUM Master, he teaches organizations how to successfully deliver their Microsoft 365 engagements while minimizing effort and increasing user adoption. Hugo has worked in Canada, the United States, Germany, Finland, Singapore, Hungary, France, and the United Kingdom. He is an active member of the SharePoint Development Community and has created several PnP reusable controls, property controls, and many sample web parts and extensions. Hugo is also a proud member of the Microsoft 365 Patterns and Practices (PnP) Team. The PnP team is a group of Microsoft employees and MVPs who coordinate the various open-source activities across GitHub and other social media channels, focused on helping the community make the best use of Microsoft products like Microsoft Teams, OneDrive, and SharePoint, and API layers like Microsoft Graph.


Reza Dorrani

Reza is a Microsoft Business Applications MVP & Principal consultant at Catapult Systems. He is a Microsoft Power Apps & Microsoft Power Automate community dual super user. He was awarded the Microsoft Flow All Star award by the community. He is also the founder and leader of the Houston Power Apps and Power Automate User Group.


Matt Schuessler


Matt has a passion for Power Apps, is an avid community member who is not afraid to speak his mind about politics, loves collaboration technologies, and is a Microsoft Power Platform consultant specializing in Microsoft PowerApps and Power Automate at Canviz. At Canviz he is definitely a jack of all trades and has a proven track record in all of the following areas, including projects for Microsoft!

• Solution Architecture / Engineering
• Business Process Optimization
• IT Project Management
• Business Analysis
• Cloud Solutions
• Development
• Migration


Todd Baginski

A graduate from the University of Cincinnati Carl H. Lindner College of Business, Todd helps grow the business and leads the technical teams at Canviz. Todd is an 11-time Microsoft MVP with over 20 years of experience in software development. He consistently keeps Canviz on the cutting edge of web, mobile, desktop, and cloud technologies. He leads the technical teams at Canviz with a passion for sharing knowledge and attention to detail. In his free time, Todd gives back to his local community by coaching and growing youth sports teams.