Application Introduction: End-to-end employee hardware ordering solution.
This solution simulates how a large organization might supply its employees' hardware needs. The offering consists of a secure application that enables comparing and ordering company-confidential configurations and pricing, runs on mobile devices, and offers support that takes advantage of global, public-facing locations and knowledge bases. This training focuses on automating the build and deployment of this application.
The business requirements for the application are:
- Securely displays only organization-IT-approved devices and prices.
- Runs on both web and mobile devices.
- Friction-free and intuitive device identification.
- Streamlined order and approval process.
- Built by the internal IT group, which does not have traditional development resources.
- Can be integrated with, and handed off to, the company's governance and update processes to support GDPR, compliance, and accessibility requirements.
Technologies utilized to achieve the business requirements:
- Dataverse: makes it easier to bring your data together and quickly create powerful apps using a compliant and scalable data service and app platform that's integrated into Power Apps.
- Power Apps: a software-as-a-service application platform that enables power users in line-of-business roles to easily build and deploy custom business apps. You will learn how to build both Canvas and Model-driven styles of apps.
- Power Automate: enables loosely coupling business logic from the presentation tier.
- Azure DevOps: automates updates and deployment.
- Power Apps Component Framework (PCF): extends the application with custom controls.
The lab consists of the following exercises:
- Create a new environment with a Dataverse database.
- Use the solution to create the entities and bring in components.
- Import configuration data from a web-based data source into the Dataverse tables.
- Create a Power Apps canvas application with components.
- Export the solution from the Power Apps environment to source code.
- Deploy the solution from source code into a new environment.
- Use environment variables in the deployment pipeline to update the solution.
- Add a PCF control.
Pre-requisites: Before starting the hands-on lab
Task 1: Download the Power Apps Solution file
Setting up the environment and data
This application will be created in a dedicated environment, which provides architectural, security, and organizational separation as well as geographic specificity.
This hands-on lab gives you an opportunity to practice the best practices for getting your app into source control, generating a managed solution from source (your build artifact), and finally deploying the app into another environment. You will need access to three Dataverse environments (Development, Build, and Production) along with Azure DevOps to automate deployments.
You will need to create these environments in your demo or customer tenant. To do so, follow these steps:
1. Log in to a tenant that you have access to and that has at least 3 GB of available capacity, which is required to create the environments.
2. Go to https://admin.powerapps.com; this will take you to the admin center.
3. Select Environments in the navigation area.
4. Select "+ New Environment" to create your first environment.
5. Name the first environment "Your Name - dev", set the region to "United States (default)", and set the Environment type to "Production" (if available; if not, use "Trial").
6. Select "Create environment".
7. Now that your environment has been created, select "Create database".
8. Set the Currency to "USD" and the Language to "English", include the sample apps and data, then select "Create database".
9. Your development environment has been created. Follow steps 4-8 above to create a second environment called "Your Name - test".
10. You now have the environments we will need for this lab.
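If you prefer scripting, the same environments can be created from PowerShell using the Power Apps administration module. This is an optional sketch, not part of the official lab steps; the display names below are examples, and you should verify the module's current cmdlet parameters before relying on it.

```powershell
# Sketch: create the dev environment from PowerShell instead of the admin center.
# Assumes the Microsoft.PowerApps.Administration.PowerShell module; names are examples.
Install-Module -Name Microsoft.PowerApps.Administration.PowerShell
Add-PowerAppsAccount   # prompts for your tenant credentials

# Create the environment (use -EnvironmentSku Trial if Production is unavailable)
$env = New-AdminPowerAppEnvironment -DisplayName "Your Name - dev" `
    -Location unitedstates -EnvironmentSku Production

# Provision the Dataverse database in it (USD currency, English = LCID 1033)
New-AdminPowerAppCdsDatabase -EnvironmentName $env.EnvironmentName `
    -CurrencyName USD -LanguageName 1033
```

Repeat with a "Your Name - test" display name for the second environment.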
ALM Hands-on Lab Overview:
During this lab you will use the account, login and environments you created in the previous steps. You will get hands-on with the full set of Application Lifecycle Management (ALM) capabilities for the Power Platform. Learn how to use the key concepts that allow customers and partners to manage solutions and deploy them across environments. Additionally, get up to speed with the details of canvas apps and flows in solutions.
- Let's get familiar with the environments that you created in the pre-lab.
- Go to https://admin.powerapps.com/environments to view the environments you have access to.
- You will see your environments: one with "dev" in the name and one with "test" in the name.
- We will use these as our primary environments during the lab. Your user has the "System Administrator" Dataverse role in all environments, giving you full access to each environment.
- In a new tab, open https://make.powerapps.com
- In the header, you will see an environment switcher that lets you move between your environments.
- When you change to a new environment, Power Apps will show only content relevant to that environment. Both environments contain a Dataverse database, which allows you to leverage the Dataverse solution infrastructure to move apps, tables, code, and other resources from one environment to another.
- Select "Solutions" in the navigation.
- Solutions are how customizers and developers author, package, and maintain units of software that extend Dataverse. Customizers and developers distribute solutions so that organizations can use Dataverse to install and uninstall the apps defined in the solution. In the next module you will learn how to create a new solution.
Importing the App in a Day Solution
To enable moving your application across environments, using components and custom entities created earlier, this lab leverages a solution you can find here: https://aka.ms/AIADSolution
- Navigate to https://aka.ms/AIADSolution and download this file to your hard drive then select Import in the Solution home to bring this solution into your Power Apps environment.
Note: this may take a couple of minutes
- Publish customizations to push the Device Order custom entity definition into this environment.
Note: when importing the solution, you will be prompted to create new connections in this environment for the ones referenced by the solution.
This process also populates environment variables in the solution.
Bringing in the Device and Manufacturer Application Data
These steps use PowerShell to populate the Dataverse tables brought in with the solution import. Note that these same steps can be done manually using the DataMigrationUtility tool.
- You can find the data file here: https://aka.ms/data.zip
Installing the needed PowerShell module
Install-Module -Name Microsoft.Xrm.Tooling.PackageDeployment.Powershell -RequiredVersion 18.104.22.1683
Once prompted to allow the install and it completes successfully, run the following (replace chass-Dev with your own development environment's organization name):
$cred = Get-Credential
$crmConn = Get-CrmConnection -OrganizationName chass-Dev -OnLineType Office365 -Credential $cred
Import-CrmDataFile -CrmConnection $crmConn -Datafile "c:\alm\Data.zip" -Verbose
(Note: this will take a couple of minutes.)
- Navigate to the solutions view and open the Device Order Solution
Congratulations you have earned the badge: “Importing Dataverse data”
The Device Ordering Power Apps Canvas App
Do not proceed before going through the lab pre-requisite steps
This lab will create a new device ordering application to replace an existing paper- and email-based system.
Power Apps Canvas Studio Layout
Power Apps Canvas Studio is available as a web application (http://make.powerapps.com) that you can use in any modern browser.
Power Apps Studio is designed to have a user interface familiar to users of the Office suite. It has three panes and a ribbon that make app creation feel like building a slide deck in PowerPoint. Formulas are entered in a function bar similar to Excel's. Studio components:
- Left navigation bar, which shows all the screens, data sources, and controls in your app
- Middle pane, which contains the app screen you are working on
- Right-hand pane, where you configure properties for controls, bind to data, create rules, and set additional advanced settings
- Property drop-down list, where you select the property for the selected control that you want to configure
- Formula bar, where you add formulas (as in Excel) that define the behavior of a selected control
- Ribbon, where you perform common actions including customizing design elements
- Additional items, where you will find the environment selection, app checker, and preview app functionality
Add an order to the Device Order Dataverse table using a form.
The application imported with the solution is a read-only view; in this exercise we will update it to take orders and write them into the Orders table.
- Open the Device Ordering Application from the solutions view.
- Click on Insert > Forms and select Edit
- Resize the edit form to the bottom of the application.
- Set the Data Source of the form to the “Device Orders” entity
- Remove the field “Created On”
- Add the Name, Price, Requested By, and Requested Date fields.
- Set the form to have two columns
- To set the properties of the Name field on the edit form, select the Name text box and open the Advanced properties. Select "Unlock to change properties".
- Scroll down to the Default property and type:
'Device Gallery'.Selected.'Device Name'
- Unlock the “Requested By” property
Set the "Requested By" default property to:
- Unlock the “Price” property
- Set the “Price” default property to:
- Rename the edit form from Form1 to "OrderForm".
- Add an icon for saving: click the Icons menu, select "Check", and move it to the bottom right of the screen as indicated below.
To write the order details to the Dataverse table, select the Check icon and in the OnSelect property type:
NOTE: If you try running your application now, you will find your order form disappears! Let's fix this.
- To display the edit form, select the Device Gallery, choose the "OnSelect" property, and type:
- Click File and select Save.
- Click the back arrow.
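The formula values in several of the steps above were originally shown as screenshots and are missing here. A plausible set of formulas, assuming the gallery is named 'Device Gallery' and the form is named OrderForm; these exact expressions are an assumption based on common canvas-app patterns, not confirmed by the lab:

```powerfx
// Default of the Name card - copies the selection from the gallery (given in the lab)
'Device Gallery'.Selected.'Device Name'

// Default of the Requested By card - the signed-in user (assumed)
User().FullName

// Default of the Price card - the selected device's price (assumed)
'Device Gallery'.Selected.Price

// OnSelect of the Check icon - write the order to the table (assumed)
SubmitForm(OrderForm)

// OnSelect of the Device Gallery - bring the form back after it resets (assumed)
NewForm(OrderForm)
```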
Congratulations badge: “Writing Data to practically any data source!”
Using Environment Variables
The Device Ordering application offers a link for customers to request support. During the development phase the team doesn't want to send test requests to the support team, but they also don't want to make application changes for the production deployment, which is what they are doing now.
This section walks through how updating the application to use environment variables solves this problem.
For more information about using environment variables directly in Canvas applications see this great community article:
Done and looking for something else to do?
- Apply a theme to your application for a new look and feel.
- Add navigation to the header component.
- Give the controls a responsive layout
- Create your own component for the form submission.
Create an Azure DevOps Project
In this section of the hands-on lab, you will create an Azure DevOps project, setup permissions and create the required pipelines to automatically export your app (as an unmanaged solution) from a development environment, generate a build artifact (managed solution) and finally deploy the app into production. The lab will also introduce you to Microsoft Power Platform Build Tools.
We are going to use Azure DevOps for both source code storage and build and deployment automation. You can use any automated source control and build automation tools using the same principles. Your configuration data should be exported from a development environment, processed by Package Deployer and checked into source control.
- Create a DevOps project by going to https://dev.azure.com, selecting "Sign in to Azure DevOps", and logging in with your credentials.
- For the pilot we will be using the organization: https://dev.azure.com/PPDevelopment
- Select "+ Create project".
- Name the project "Your Name - DevOps Project", make it a Private project, and select "Create".
- Your DevOps project has now been created. Note the URL at the top of the page and bookmark it or save it locally.
- You are now ready to begin the ALM Hand-on Lab
- When the project is completed, you will need to create a Repo to hold the source code. Click on the Repos link in the left navigation.
- Initialize the repo with the default README by clicking the Initialize button.
Below is a screenshot of the repo post initialization.
Install and Enable the Azure DevOps Extensions for Power Platform
In the pilot class we are using a Beta of the Power Platform Build Tools. To enable this, we are all sharing one Azure DevOps Organization and this section has been done for you.
Soon you will be doing this in your own organizations using the Azure DevOps Market Place.
Navigate to the organization page: click the Azure DevOps link, then the Organization Settings link in the bottom left of the navigation panel.
On the organization settings page, click the Extensions link in the left navigation panel.
- For this lab we will be using a beta build of the Power Platform Build Tools to enable the use of environment variables. This beta extension has already been shared with this organization and can be found on the Shared tab.
- Select "Go to Marketplace".
After the event, you will find the Power Platform Build Tools in the Azure DevOps Marketplace.
In addition to creating build pipelines, we will be updating environment variables and will be using the “Replace Tokens” extension, this too needs to be installed.
- Click "Browse marketplace". This will redirect you to the Azure DevOps Marketplace. Search for "Replace Tokens" and select the Replace Tokens extension. Fun side note: this widely used and popular extension is written and maintained by a Microsoft MVP.
- Click ‘Get it free’.
- Click ‘Install’
- Click ‘Proceed to organization’
Configure Azure DevOps Permissions for Build Service Account
The build pipelines that we set up later will be exporting files from an org and checking them into your source code repo. This is not a default permission of Azure DevOps, so we need to configure the appropriate permissions for the pipeline to function properly.
- Navigate back to the project created earlier:
- Click the Project Settings icon on the lower left of the screen and click the Repositories link in the fly out menu.
- Navigate to the Repositories Menu
- Type in Project Collection Build Service in the search box and select it (select the project collection with the username appended)
- The setting for the user is displayed. Select Allow for ‘Contribute’ and ensure that the green checkbox is displayed.
Create a Service Connection
These build pipelines will be importing and exporting solutions from environments that could potentially span tenants. Service connections are where the credentials for this access are stored. This section walks through creating the service connection.
- Under Project settings select “Service Connections”
- Select: “Create service connection” Button
- Select Generic. Note: For production pipelines it is recommended that you use the Power Platform Service Connection for more secure scenarios like Multi Factor Authentication (MFA)
- Specify the Server URL for the environment where we imported and setup our solution.
- Retrieve the Server URL in the Power Platform Administration Center:
- Supply the credentials for your tenants as supplied by the instructor.
- Save the Service Connection
Build Pipeline 1: Create Export from Dev
The first pipeline you create will export your solution from your development environment as an unmanaged solution, unpack it, and check it into source control (your repo).
- Start with a YAML file.
- Click the Create Pipeline button.
- Select "Azure Repos Git" for "Where is your code?"
- Select the repo created above.
- Select the "Existing Azure Pipelines YAML file" option.
- Select the "Export-From-Dev.YML" file checked in above.
- Save and run your newly created pipeline.
NOTE: If you see the following errors, you may have missed the step of adding the Power Platform Build Tools BETA above.
Congratulations you have successfully exported your solution from a Development environment to an Azure DevOps repo!
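For reference, an export pipeline like Export-From-Dev.YML is typically assembled from Power Platform Build Tools tasks plus a script step that commits the unpacked files. The sketch below is illustrative only; the task versions, solution name (DeviceOrder), service connection name, and branch are assumptions, not the exact contents of the checked-in file.

```yaml
# Illustrative sketch of an export-from-dev pipeline (not the exact lab file).
trigger: none

steps:
- task: PowerPlatformToolInstaller@0         # installs the Power Platform tooling
- task: PowerPlatformExportSolution@0
  inputs:
    PowerPlatformSPN: 'DevServiceConnection'  # assumed service connection name
    SolutionName: 'DeviceOrder'               # assumed solution name
    SolutionOutputFile: '$(Build.ArtifactStagingDirectory)\DeviceOrder.zip'
- task: PowerPlatformUnpackSolution@0
  inputs:
    SolutionInputFile: '$(Build.ArtifactStagingDirectory)\DeviceOrder.zip'
    SolutionTargetFolder: '$(Build.SourcesDirectory)\DeviceOrder'
- script: |
    git config user.email "build@pipeline"
    git config user.name "Build Pipeline"
    git add --all
    git commit -m "Latest solution export"
    git push origin HEAD:master
  displayName: Commit unpacked solution to the repo
```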
- You can validate what the build did by looking at the repo.
Automate the configuration and data import into the Solution
As with the manual solution import, the solution comes in without the state data needed by the application. In this set of steps we are going to automate the import of the application data.
If you didn't manually import the data above, you will need to download the data.zip file from https://aka.ms/data.zip
- Under Repos > Files, open the DeviceOrder folder and select the vertical ellipsis so you can select New > File.
For the file name call it:
Note: The file name should have automatically added the “DeviceOrder” folder name in the dialog.
- Commit this file to the repo
- Add all the files from Data.zip to this folder
- Drag all three files to the: “Drag and drop files here” area and commit the three files. (The reason we didn’t just create a folder above is Git doesn’t support empty folders)
- Commit those file additions/updates.
- Deployments settings are contained in the file deploymentSettings.JSON which resides in the DeviceOrder folder.
To add this file select the vertical ellipsis to the right of the SolutionPackage folder, select upload file and upload this file:
- Commit this file addition.
Create the Deploy to Test Pipeline
As above, we will create the pipeline that deploys to the Test environment from an existing file.
- Download the pipeline definition file from here: https://aka.ms/build-and-deploy-to-test.yml
- Upload this pipeline definition into the repo. NOTE: This file needs to be in the root of the project repo.
- Upload the build-and-deploy-to-test.yml file to the project repo
- Create a new pipeline by selecting the Pipelines Flyout menu.
- Create the pipeline based on Azure Git Repo
- Create the pipeline based on the file build-and-deploy-to-test.yml file in our project Repo
- Select use Existing Azure Pipelines YAML file option
- Select the build-and-deploy-to-test.yml file and choose “Continue.”
- Save pipeline definition.
ADD PIPELINE RENAME
Setting up the variables for the Test Deployment Pipeline
The deploy to test pipeline definition expects a username, URL and password for the environment being deployed into. This section will set the values for those variables.
- Select the variables button.
- Select New Variable
- Create a variable for the Username using the settings supplied by the instructor, or for the environment you want to deploy to.
- Add another variable.
- Name it Password and set the value to the credentials supplied for the Test environment.
- Create another variable named "Url" and set the value to your Test environment's URL.
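Pipeline variables defined this way are referenced in YAML as $(Name). A sketch of how build-and-deploy-to-test.yml might consume them (illustrative only; mark Password as a secret variable so it is masked in logs):

```yaml
# Illustrative only - the lab's actual YAML may differ.
steps:
- script: echo "Deploying to $(Url) as $(Username)"
  displayName: Show deployment target
  # Secret variables are not exposed to scripts automatically;
  # map them explicitly via env: if a script step needs them.
  env:
    DEPLOY_PASSWORD: $(Password)
```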
Creating the Test Service Connection
As the Test deployment pipeline will be deploying into another environment, and potentially another tenant, a new service connection is needed to hold those credentials.
- Under Pipelines select “Service Connections” then the button “New Service Connection”
- Set the values of the service connection for the credentials and Server URL to the Test (Destination) Environment.
ADD PIPELINE RENAME Step
Making the pipelines dynamic through Environment Variables
A common development scenario is to stub calls to production services during development so as not to flood those services with test calls; this practice is often called using fakes. In the Device Order application we don't want to send the support team nonproduction support requests, so we made the support team's email address an environment variable. In this solution there are two artifacts we want to update dynamically: the connection reference the email is sent with, and the email address itself, which you can see defined as new_SupportEmailAddress. This section shows how to have the build pipelines specify these settings as the solution progresses from Dev to Test to Production, without changing the underlying application.
This screenshot shows the environment variable in the solution
This screenshot shows the environment variable properties in the solution.
In our project these settings are held in the deploymentSettings.JSON file.
Contents of deploymentSettings.JSON
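The file contents were originally shown as a screenshot. A deploymentSettings.json for this solution plausibly looks like the following; the schema names come from the lab, while the email address and connector path are example values:

```json
{
  "EnvironmentVariables": [
    {
      "SchemaName": "new_SupportEmailAddress",
      "Value": "support@contoso.com"
    }
  ],
  "ConnectionReferences": [
    {
      "LogicalName": "new_sharedoffice365_1812f",
      "ConnectionId": "e5c5fd57c8c845dc8fd1d063a89269eb",
      "ConnectorId": "/providers/Microsoft.PowerApps/apis/shared_office365"
    }
  ]
}
```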
To make the properties dynamic, we need to tokenize the deployment settings file and add a Replace Tokens task to our pipeline.
- Tokenize deploymentSettings.JSON by adding token characters to the setting values. The Replace Tokens extension will then set these values in the pipeline at runtime.
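With the Replace Tokens extension's default token pattern (#{...}#), the tokenized file would look roughly like this; at runtime the extension substitutes the pipeline variables of the same names (the connector path is an example value):

```json
{
  "EnvironmentVariables": [
    {
      "SchemaName": "new_SupportEmailAddress",
      "Value": "#{new_SupportEmailAddress}#"
    }
  ],
  "ConnectionReferences": [
    {
      "LogicalName": "new_sharedoffice365_1812f",
      "ConnectionId": "#{new_sharedoffice365_1812f}#",
      "ConnectorId": "/providers/Microsoft.PowerApps/apis/shared_office365"
    }
  ]
}
```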
The next step would be to set up the Replace Tokens extension by editing the build pipeline (this has been done for you).
Then set the properties of the replace tokens extension so we are looking in the solution folder and only in the deploymentSettings.JSON file. (This has been done for you)
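The pre-configured Replace Tokens step typically looks like this in YAML; the task version and folder path below are assumptions, not the exact pre-configured values:

```yaml
# Illustrative Replace Tokens step - substitutes #{...}# tokens in the settings file.
- task: replacetokens@3
  inputs:
    rootDirectory: '$(Build.SourcesDirectory)\DeviceOrder'
    targetFiles: 'deploymentSettings.json'
    tokenPrefix: '#{'
    tokenSuffix: '}#'
```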
We then need to create variables to supply the values for the ConnectionId and new_SupportEmailAddress
- Set the ConnectionID Variable Properties to: name new_sharedoffice365_1812f and value of e5c5fd57c8c845dc8fd1d063a89269eb
- Create the new_SupportEmailAddress variable with the name new_SupportEmailAddress and some email address as the value.
- Run the Build-and-deploy-to-test Pipeline
Note: If your build gives the error:
“Could not load file or assembly ‘System.Management.Automation.resources”
This is a generic error the beta extension reports for all failures. To see what is causing the build error you will need to enable diagnostics and investigate the real cause. Common causes are missing variables and attempting to deploy a managed solution into an environment where an unmanaged version of that solution is already imported. For instance, in the diagnostics you might see the output:
##[debug]Message: Solution manifest import: FAILURE: The solution is already installed on this system as an unmanaged solution and the package supplied is attempting to install it in managed mode. Import can only update solutions when the modes match. Uninstall the current solution and try again.Detail:
For final confirmation, log into your production system and see your application!
<Greg Hurlman to add PCF section>
© 2019 Microsoft Corporation. All rights reserved.
Information in this document, including URL and other Internet Web site references, is subject to change without notice. Unless otherwise noted, the example companies, organizations, products, domain names, e-mail addresses, logos, people, places, and events depicted herein are fictitious, and no association with any real company, organization, product, domain name, e-mail address, logo, person, place or event is intended or should be inferred. Complying with all applicable copyright laws is the responsibility of the user. Without limiting the rights under copyright, no part of this document may be reproduced, stored in or introduced into a retrieval system, or transmitted in any form or by any means (electronic, mechanical, photocopying, recording, or otherwise), or for any purpose, without the express written permission of Microsoft Corporation.
Microsoft may have patents, patent applications, trademarks, copyrights, or other intellectual property rights covering subject matter in this document. Except as expressly provided in any written license agreement from Microsoft, the furnishing of this document does not give you any license to these patents, trademarks, copyrights, or other intellectual property.
The names of manufacturers, products, or URLs are provided for informational purposes only and Microsoft makes no representations and warranties, either expressed, implied, or statutory, regarding these manufacturers or the use of the products with any Microsoft technologies. The inclusion of a manufacturer or product does not imply endorsement of Microsoft of the manufacturer or product. Links may be provided to third party sites. Such sites are not under the control of Microsoft and Microsoft is not responsible for the contents of any linked site or any link contained in a linked site, or any changes or updates to such sites. Microsoft is not responsible for webcasting or any other form of transmission received from any linked site. Microsoft is providing these links to you only as a convenience, and the inclusion of any link does not imply endorsement of Microsoft of the site or the products contained therein.
Microsoft and the trademarks listed at https://www.microsoft.com/enus/legal/intellectualproperty/Trademarks/Usage/General.aspx are trademarks of the Microsoft group of companies. All other trademarks are property of their respective owners.