BMW K1200GT Powered Jet Boat?

Since I stopped using Twitter (and the account has since been hacked), people have been asking what I have been working on in retirement.

The truth is mostly (boring) home projects that I had put off, done to list our Redmond house for sale: clearing out a creek bed, removing tree stumps from our yard, tearing down an old barn and hauling it to the dump, and clearing land around our Florida house for a new fence and gates. The plan is to sell the Redmond house and “snowbird” between a Pacific Northwest house on the Olympic Peninsula and our house near Key West.

A couple of these projects have been “interesting,” like refurbishing* a 1985 Ford F350 with a 460 big block and 4-speed and using it to tow a 10,000 lb/32′ boat to Florida, and some have even been fun, like buying a 2002 (red) Turbo Beetle for my daughter and restoring/resto-modding it back to as-new condition. (In this picture I am polishing the headlights back to a clear state.)



All that said, the project I have been looking forward to the most is installing a superbike engine into an older jet-powered ski boat to make the “ultimate” shallow-water flats boat for Florida.

My goals for this project are:

  1. Quiet at idle
  2. Shallow running
  3. Capable of ~35 MPH
  4. Light weight (so it runs shallow and I can lift it with a boat lift)
  5. Doesn’t burn a ton of fuel
  6. Large enough to take my wife and daughter snorkeling



To this end I have been watching the insurance salvage sites, OfferUp and Craigslist for my “donor” vehicles.

The first vehicle to show up was a 2006 BMW K1200GT that had been in a fire. I bid $25 and, much to my wife’s chagrin, won it!

After tearing off the melted fairings, burnt seat and destroyed fenders, replacing the wiring harness, and fitting new handlebars (to replace the burnt switches), I managed to get it started. I now have a very light 150 HP, shaft-drive, four-stroke motor for my jet boat project! Woot!


Now to start looking for a donor boat that met my criteria: an older fiberglass ski boat, jet powered by a Berkeley pump (the most common jet, to ease finding parts), on a trailer, and, most importantly, nearly free!

The last boat I found that fit these criteria was an 18′ Apollo for $500, listed 5 years ago. With the boat shortages due to the pandemic, I assumed I would be looking for at least 6 months.

The plan was that, while looking for a donor boat, I would turn the bike into a café racer/scrambler and ride it around the neighborhood. (My neighbors would love that!)


Found a donor boat!

While scanning the local Craigslist I found an ad for “$300 Ski boat”. No manufacturer, no age, no engine details, but in the pictures I could see a solid-looking trailer and a Berkeley jet.

So, like any wise shopper, I told my wife I was going to “go shopping,” hit the ATM for $300, and told the owner I wanted it sight unseen! (Okay, not that big a risk, as the trailer was worth more than the asking price.)

So what did I get?

While there was no identification on the hull, from pictures I identified it as a ~1972 Jolly Roger 17 (17 feet).

The Berkeley jet is a 12JC-A.

When I lifted the engine cover I was shocked to find an engine! Specifically an Oldsmobile big block 455 (per the casting marks), with factory forged internals and Hardin manifolds!

Unfortunately, a previous owner had pulled the intake and distributor allowing the engine to fill with water (grrrr).

So what’s next?

Clearly I need to remove the current motor and mount the motorcycle motor in the boat, which will be interesting since the BMW motor is designed to be suspended from above rather than supported from below like the Olds 455.

That said I see some other issues right off:

  1. The hull has a much deeper V than I was hoping for and will likely run deeper than wanted. But I will put the boat in the water with the new motor before deciding whether this needs “fixing” (remember, the boat is going to lose a lot of weight in the conversion).
  2. The Berkeley type “A” pitch impeller is designed for a big block Oldsmobile making ~225 HP and ~400 ft-lbs of torque @ 4000 RPM, versus the BMW’s ~150 HP and ~100 ft-lbs @ 8000 RPM. The correct solution seems to be a different pitch impeller, like an “E”, but since the bike has a 6-speed transmission I am wondering if I can just run it in 1st. (With the Olds 455 it was likely a 45 MPH boat; I only want 35 MPH.)
  3. While the boat bottom seems solid, the transom looks rotten (to be expected for a $300 boat <sad smile>).
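The impeller/gearing tradeoff in item 2 can be sanity-checked with some quick arithmetic. A minimal sketch (the 2.5:1 overall reduction is a made-up illustrative number, not an actual K1200GT gear ratio):

```python
# Back-of-envelope check (illustrative only): running the BMW in a lower
# gear multiplies torque and divides RPM at the jet pump's input shaft.

def at_pump(engine_rpm, engine_torque_ftlb, overall_ratio):
    """Return (rpm, torque) seen by the pump through a gear reduction."""
    return engine_rpm / overall_ratio, engine_torque_ftlb * overall_ratio

# Olds 455 baseline: effectively direct drive to the Berkeley pump.
olds_rpm, olds_torque = at_pump(4000, 400, 1.0)

# BMW at peak power through an assumed 2.5:1 overall low-gear reduction.
bmw_rpm, bmw_torque = at_pump(8000, 100, 2.5)

print(f"Olds 455 at pump: {olds_rpm:.0f} RPM, {olds_torque:.0f} ft-lb")
print(f"BMW in low gear:  {bmw_rpm:.0f} RPM, {bmw_torque:.0f} ft-lb")
```

Even a modest reduction brings the torque at the pump much closer to what the “A” impeller was loaded for, at the cost of pump RPM, which is why a lower-pitch impeller still looks like the cleaner fix.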


As I will be fishing in Florida much of the spring and fall, I’m guessing the first water trial will be next winter.

(Yes, I realize I am snowbirding a little off-schedule. This schedule lets me spend the holidays with the family in the Pacific Northwest.)





*Pulled the Ford F350’s 460 big block and replaced the clutch, intake manifold, windshield, carb, fuel pump, water pump, alternator, pressure plate, valve covers, brakes, wiper motors, all the belts, master brake cylinder, heater fan and power steering pump; added new seats, navigation and gauges; and rebuilt the dash. In retrospect I am not certain I would have towed such a large boat across the country in such an old truck, but I am very, very glad it is done!

How to enable new speakers in a global pandemic world


A year ago I wrote an article, Presentation Skills Brown Bag in a Virtual World,

which was inspired by brown bags I used to deliver at Microsoft (10 Techniques to Help You Present Better).

That led one of our MVP leads in the APAC region to ask me:

How do you enable new speakers, particularly in a virtual world?


My pre-Covid answer would have been:

  1. Partner these would-be speakers with an experienced presenter (co-presenting)
  2. Pick a topic that sets them up to succeed
  3. Explain how they fit into the overall presentation
  4. Scope their delivery to something small (i.e. a demo) BUT ensure you have the flexibility to allow them to continue if needed.

Which raises the question: how do you translate this to a virtual setting?

…With some orthogonal thinking, small tweaks and minor additions…

I have amended my virtual-world presentation skills suggestion list below with those enhancements:



  • Share the stage

    • Bring a friend! (Lisa Crosbie, on how you or your co-presenter can help with the presentation chat, raised hands, etc.)

    • Additional techniques to enable sharing the stage w/ a friend (Chuck to cover)

      • “Time master”

      • “Code Line Count Master”

  • Be the story

    • Screen Share/multiple monitors is your friend by Lisa Crosbie
    • Blur your background and/or use a stock background (PowerPoint’s rule of three++) (by Chuck)
    • Prerecord your Demo! By Lisa Crosbie

  • Engage your audience

    • Audience size shouldn’t control your “voice”
    • The physical activity now becomes camera framing
    • The eye contact activity that used erasers in the room now uses Post-its behind the screen

  • LOUDER equals better

    • Mute your audience

  • Emphasize appropriately

    • Voting (Also a great engagement technique!)





Vancouver User Group July 22nd: Data Flows and a sneak peek at the new commanding v2 for model-driven Power Apps, by Olena Grischenko and Scott Durow


Data Flows in Power Apps by Olena Grischenko  (30 minutes)

No-code integration, what? No-code data migration, why? There are things that aren’t easy to unlearn in the development world. System integration and data migration are two of the most difficult parts of any business application project. Can we solve them with no-code? Yes, we can, in a very easy and cost-effective way. You could ask why: why do we need new tools in a space where we have solved problems the same common way for many years? This talk is useful for developers as well as for people who don’t use code. Learn about new possibilities of the Power Platform which nobody talks about even though they do exist. Learn about simple and not-so-simple transformations from CSV, text files and SQL to Dataverse. Become a new Power Platform data integration superhero!


Sneak Peek at Commanding for Model-driven Power Apps by Scott Durow (30 minutes)

Changing the commanding in model-driven Power Apps has traditionally been reserved for code-based solutions. In this session, Power Platform MVP and User Group leader Scott Durow will show us a sneak peek at how this is going to become much easier in the near future. (Sorry, this portion is currently not planned to be recorded or posted due to its unreleased nature.)



Olena Grischenko


Technomancy – make systems work like magic: #PowerLabs – start with Microsoft technologies!



Scott Durow


Hi, I’m Scott and I am truly passionate about helping people get the most out of the Microsoft Power Platform


Scott is a committed and personable software architect/technologist with a successful track record for realising business vision through enterprise/application architectures that are tightly aligned with budget and time scales. By combining his detailed technical knowledge with a clear grasp of the wider commercial issues Scott is able to identify and implement practical solutions to real business problems. Scott is also an excellent communicator and technical author.

Selecting a Florida medium class inshore casting reel

Last October I was lucky enough to get a place in Florida on Little Torch Key and spent much of my winter chasing grouper, snapper and particularly tarpon with the gear I brought with me from Washington.

While I caught a lot of fish, and some nice juvenile tarpon, while in Florida, I came to realize that the gear I brought down wasn’t ideal. It was designed for Pacific Ocean sea bass, lingcod and salmon, all with moderate-to-slow action rods and reels built more for dropping and trolling than for making hundreds of casts per trip.

For light gear this was easily resolved by getting a 7′ fast-action extension rod for my Azores 4000-series spinning reel. This setup with a ¼ oz jig head* and a 4″ paddle tail is my go-to for practically everything around Little Torch Key.

For larger fish, medium-sized tackle and fishing live baits, I would trade back and forth between my Okuma Komodo 364 (just an “okay” casting reel) and my Lexa 400 (a little too big for casting those ¼ oz jig heads), with one of the custom three-piece rods I wrapped.

While this medium-sized setup “worked,” due to its size I found myself always gravitating back to the light gear, and I ended up hooking enough large sharks and monster barracuda on that light setup to realize this 4000-class spinning gear was outclassed for a lot of the fish I wanted to be targeting.

Clearly a new Medium-Heavy class setup was called for!

As I wrap my own rods, picking a heavy, fast-action 7′–7′6″ blank and acid-wrapping it is an easy solution.

The challenge came in selecting the reel, as there are several great choices and I wasn’t certain how they really line up.

Starting with what I didn’t like in my current gear:

  • Lexa 400: great casting reel but a little large
  • Komodo 364: great size and drag, but not a great casting reel due to its non-disengaging level wind

With that information, I went out and collected the specifications for the Shimano and Daiwa reels in this size.

I eliminated all the Okuma and Penn reels as they do not have disengaging level winds, and removed Abu Garcia as I don’t see many of them in the shops down in the Keys and they don’t seem to be as well known for saltwater.

With that being the case let’s take a look at the characteristics of these reels!





Reel          Price
ProRex 400    $319.00
Lexa 300      $200.00
Lexa 400      $249.00
Tatula 300    $269.00
Tranx 300     $279.00
Tranx 400     $299.00
Curado 300    $199.00

(The original table also listed line capacity at 14 lb and line retrieved per crank; those values did not survive in this copy.)

While weight isn’t really the issue, it is often an indicator of a reel’s size and of the ability to palm it while casting, and should give insight into how comfortable it will be for 8+ hours of casting.

While the Lexa 400 is clearly the standout heaviest reel in the collection, what was amazing is how similar the rest of these reels were at 11–12 ounces; it seemed any of them would work in terms of size and weight.

Line Capacity

As this is to be my medium-heavy rig, it needed the line capacity to let fish run! With most of the reel weights being the same, a reel with larger line capacity seemed preferable, so either the ProRex or the Tranx 400.


Just as weight is an indicator of size, price can inform you of build quality. Must admit I was surprised to see the ProRex more expensive than the Tranx!


External Anti-Backlash Brakes

Must admit I started this effort to justify buying a Tranx, but finding out that Shimano uses an anti-backlash braking system that requires taking the side cover off to adjust leveled the playing field!

In Florida I fish a very small (13′) technical skiff and have found “If something can fall into the water…it will!”



After using a couple of Lexas and Curados I was confident any of these reels would be a great companion for my slightly-too-large Lexa 400.

After looking at the data and their price points, I decided on getting a Tatula 300.

While looking for a good price on a Tatula 300, I found a ProRex 400 sitting in SportCo for $230!

While not the reel I had decided on, getting the most expensive reel in the roundup for the cheapest price was simply too good a deal to pass up!

Winner: Daiwa ProRex 400

*One issue I am finding here in the USA is that most ¼ oz jig heads have small wire hooks. In Australia it is pretty common to find 1/8 and ¼ oz jig heads with larger 5/0 and 7/0 hooks.

Creating Azure Functions, Custom Connectors and Using Them Inside Power Apps

There are over 400 connectors to connect Power Apps to almost every conceivable service, database or back end. This article shows how to connect to endpoints that don’t have a connector, using custom, purpose-built Azure Functions. The documentation does a great job of walking through creating an Azure Function; this walkthrough assumes you have followed those directions to create an Azure Function and starts from there.

Creating the Azure Function

Step 1. Create a new Function using the Azure Functions Extension.

Select the Azure Functions extension for Visual Studio Code and add a new function using the lightning bolt glyph.

Step 2. Set the function to use the HttpTrigger trigger.

Step 3. Name the Azure function.

In this example we have named the function “SydneyWeather”

Step 4. Supply the name space name.

In this example we have used the name “Getweather.Function”

Step 5. Set the Permission Access type.

Since this is a demo, set the access to anonymous.

Step 6. Check in the Function to version control.

Step 7. Deploy this function into the Azure Subscription.

This deployment updates the function created earlier in the lab.

(Note: this is for demo purposes only; normally one would not deploy directly to a production site.)

Step 8. Run the Function interactively in a browser.

In this example the function was deployed under the name “GetTemperature” and can therefore be called using the following URL:

This returned the following:

    “This HTTP triggered function executed successfully. Pass a name in the query string or in the request body for a personalized response.”


If a query string is passed in like the following:

it will return the following:

    “Hello, Chuck. This HTTP triggered function executed successfully.”


While the Azure Functions extension for Visual Studio Code makes it easy to create an HTTP-triggered function, the default template returns a plain string rather than JSON. A function like this can still be called from a flow in Power Automate, but Power Apps requires all connectors to return JSON.
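The difference matters because the connector’s response schema is inferred from the body. As a quick illustration of the two payload shapes (Python here purely for illustration; the actual function is C#):

```python
import json

# What the default HttpTrigger template returns: a plain string.
plain = "This HTTP triggered function executed successfully."

# What Power Apps expects: a JSON object, like the payload the updated
# function builds with JsonConvert.SerializeObject in the step below.
payload = json.dumps({"name": "Mostly Sunny", "location": "Sydney"})

print(payload)  # {"name": "Mostly Sunny", "location": "Sydney"}
```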


Update the function to return JSON

Step 9. Open the function implementation.

Click on the Files view in Visual Studio Code and open the SydneyWeather function.

Step 10. Edit the function to return JSON

Comment out the template return and paste in the following code:

            // Replace the template return with a serialized JSON payload:
            //return new OkObjectResult(responseMessage);

            var myObj = new { name = "Mostly Sunny", location = "Sydney" };

            var jsonToReturn = JsonConvert.SerializeObject(myObj);

            return new OkObjectResult(jsonToReturn);


This should look like the following image:

Step 11. Checkin this update.

Step 12. Deploy these updates:


Step 13. Run the function interactively from the Browser.

Running the updated function in the browser will now return the following JSON:

Congratulations, your Azure Function is now ready to be called by Power Apps!

Creating a Power Platform custom connector to reference the Azure Function.

Step 14. Log in to Power Apps.


Step 15. Navigate to the Custom Connectors hub under the Data Menu


Step 16. Select New Custom Connector > From Blank Template

Supply a connector name; in this case I simply called it “SydneyWeather”.

Step 17. Supply the initial Connector information.


Step 18. Continue past the Security section (no need to set anything).

Step 19. Set the general definition data

Step 20. Create the request via “Import from Sample”

Change the verb to “GET” and paste in your function URL. From the example above it would be:

Then click Import.



Step 21. Create the response from example body


Using the browser run response, paste the response into the body and select Import. In the example above it would be:

        {"name":"Mostly Sunny","location":"Sydney"}


Step 22. Proceed to Test

Step 23. Create the Custom Connector and a connection to it.

Step 24. Test the operation.


Using the connector from Power Apps

Step 25. Create a new Canvas Application.

Step 26. Navigate to the Connectors.

Step 27. Select our custom connector.



Step 28. Use the custom connector to invoke the Azure function.

Add a button and set its OnSelect to the custom connector’s namespace and function name.

In this example it would be:



Congratulations, you have created an Azure Function and a custom connector and called them from Power Apps!

Enjoy a tropical vacation at Paradise Blue

Situated along the continental US’s only tropical reef at Little Torch Key, Paradise Blue is a diver’s or fisherman’s dream vacation location.

For scuba divers, Paradise Blue is located near the Looe Key Florida marine sanctuary.

Paradise Blue Features:





The Florida Keys hold more fishing world records than any other location, and fishing from Paradise Blue it is easy to see why.

Winter is an inshore fisherman’s dream, with speckled trout, mackerel, kingfish and grouper on the inshore reefs.

Spring is the beginning of the tarpon and permit seasons, with offshore blackfin tuna very willing to visit.

Summer is when the offshore game fish, and the flats for bonefish and tarpon, really light up!

All year, yellowtail, mutton and mangrove snapper await your angling pleasure!


Scuba Diving

Looe Key is a coral reef located within the Florida Keys National Marine Sanctuary. It lies to the south of Little Torch and Big Pine Key. This reef is within a Sanctuary Preservation Area (SPA). Part of Looe Key is designated as “Research Only,” an area which protects some of the patch reefs landward of the main reef.











The reef is named after HMS Looe, which ran aground on the reef and sank in 1744.

Elkhorn coral at Looe Key

Sun Bathing

Paradise Blue offers its residents a private pool, hot tub and sun bathing station on the canal for any sun worshipper looking to achieve a golden hue!


Paradise Blue boasts a tiki bar, cards, board games, a kayak, a stand-up paddleboard, a barbecue, a covered sitting lounge, a large screened porch, a large-screen TV, an Xbox One S with over 50 games, and an ice maker capable of making 100 lbs of ice per day; residents should never want for things to do.


Nearby Amenities

With three bars, several restaurants, two tackle shops and a large grocery store within 5 miles, conveniences are never far away.


Little Torch Key is an island in the lower Florida Keys.

Situated along U.S. Route 1 (also known as the Overseas Highway), which crosses the key at about mile markers 28–29, it is immediately preceded to the northeast by Big Pine Key and followed by Middle Torch Key to the southwest.[3]

A small island 24 miles (39 km) from Key West, Little Torch Key is home primarily to locals, living and working from Big Pine Key to Key West. The island is also host to visitors who don’t mind a commute to the popular destination of Key West. There are a few, but not many businesses on the island, including restaurants and lodging.

Like all of the keys in the Torch Keys, this key was probably named for the native torchwood tree, Amyris elemifera L. The north end of the key is the site of a former settlement which was abandoned in 1938 when the highway was relocated.

Its most likely claim to fame is as a relatively frequent fishing destination for U.S. President Harry S. Truman. A Reuters story on February 14, 2009, named a resort there as one of the “Top 10 most romantic retreats”.


For booking Paradise Blue please visit:

Power Platform Developer Training

Application Introduction: an end-to-end employee hardware ordering solution.

This solution simulates how a large organization might supply its employees’ hardware needs. The offering consists of a secure application that enables comparing and ordering company-confidential configurations and pricing, runs on mobile devices, and offers support that takes advantage of global/public-facing locations and knowledge bases. This training will focus on automating the build and deployment of this application.

The business requirements for the application are:


  1. Securely displays only organization IT approved devices and prices.
  2. Runs on both web and mobile devices.
  3. Friction free and intuitive device identification.
  4. Streamlined order and approval process.
  5. Built by the internal IT group who does not have traditional development resources.
  6. Can be integrated and handed off to the company’s governance and updating process to enable GDPR, compliance and accessibility.


Technologies utilized to achieve the business requirements.

  • Dataverse: Make it easier to bring your data together and quickly create powerful apps using a compliant and scalable data service and app platform that’s integrated into PowerApps.
  • PowerApps: A software as a service application platform that enables power users in line of business roles to easily build and deploy custom business apps. You will learn how to build both Canvas and Model-driven style of apps.
  • Power Automate to enable loosely coupling business logic from presentation tiers.
  • Azure DevOps for the automation of updates and deployment
  • Extending the application using Power Apps Component Framework controls



  1. Create a new environment with a Dataverse Database.
    1. Use the solution to create the entity and bring in components.
    2. Import configuration data from a web-based data source, creating entities in Dataverse.
  2. Create a Power Apps canvas-based application with components.
  3. Exporting Solution from Power Apps environment to Source Code
  4. Deploying the solution from Source Code into a new environment
  5. Using Environment variables in the deployment pipeline to update the Solution.
  6. Adding a PCF Control (Greg Hurlman)


Pre-requisites: Before starting the hands-on lab

Task 1: Download the Power Apps Solution file

Setting up the environment and data

This application will be created in a dedicated environment that enables architectural, security and organizational separation as well as geographic specificity.


This hands-on lab will give you an opportunity to get hands-on with the best practices to get your app into source control, generate a managed solution from source (your build artifact), and finally deploy the app into another environment. You will need access to 3 Dataverse environments (Development, Build & Production) along with Azure DevOps to automate deployments.

Lab Setup:

You will need to create three environments in your demo or customer tenant. To do this follow these instructions:

  1. Log in to a tenant that you have access to and that has a minimum of 3 GB available capacity, which is required to create 3 environments.
  2. Go to , this will take you to the admin center
  3. Select Environments in the navigation area

  4. Select “+ New Environment” to create your first new environment.

  5. The first environment should be named “Your Name – dev”, set the region to “United States (default)”, and set the environment type to “Production” (if available); if not, use “Trial”.

  6. Select “Create environment”
  7. Now that your environment has been created select “Create database.”

  8. Set the Currency to “USD” and Language to “English”, Include the sample apps and data then select “Create database”

  9. Your development environment has been created; follow steps 4 – 8 above to create a second environment called “Your Name – test”.

  10. You now have the environments that we will need for this lab.


ALM Hands-on Lab Overview:

During this lab you will use the account, login and environments you created in the previous steps. You will get hands-on with the full set of Application Lifecycle Management (ALM) capabilities for the Power Platform. Learn how to use the key concepts that allow customers and partners to manage solutions and deploy them across environments. Additionally, get up to speed with the details of canvas apps and flows in solutions.


  1. Let’s get familiar with the environments that you created (in the pre-lab).
  2. Go to to view the environments you have access to
  3. You will see your environments, one with dev in the name and one with test in the name.

  4. We will use these environments as our primary environments during the lab. Your user will have the “System Administrator” Dataverse role for all environments giving you full access to the environment.
  5. In a new tab open

  6. In the header, you will see a way to switch environments to move between your different environments.

  7. When you change to a new environment, the view will update to show only content relevant to that environment. Both environments contain a Dataverse database. This allows you to leverage the Dataverse solution infrastructure to move apps, tables, code and other resources from one environment to another.
  8. Select “Solutions” in the navigation

  9. Solutions are how customizers and developers author, package, and maintain units of software that extend Dataverse. Customizers and developers distribute solutions so that organizations can use Dataverse to install and uninstall the app(s) defined in the solution. In the next Module you will learn how to create a new solution



Importing the App in a day Solution.

To enable moving your application across environments, using components and custom entities created earlier, this lab leverages a solution you can find here:

  1. Navigate to and download this file to your hard drive then select Import in the Solution home to bring this solution into your Power Apps environment.


Note: this may take a couple of minutes

  1. Publish customizations to push the Device Order custom entity definition into this environment.

Note: importing the solution will now have you create new connections in this environment for ones that were in the solution.

This process also populates environment variables in the solution.

Bringing in the Device and Manufacturer Application Data

These steps will populate the Dataverse tables brought in with the solution import, using PowerShell. Note: these same steps can be done manually using the DataMigrationUtility tool.

Installing the needed PowerShell module



Install-Module -Name Microsoft.Xrm.Tooling.PackageDeployment.Powershell -RequiredVersion


After being prompted to run and a successful install, run the following:


$cred = Get-Credential

$crmConn = Get-CrmConnection -OrganizationName chass-Dev -OnLineType Office365 -Credential $cred

Import-CrmDataFile -CrmConnection $crmConn -Datafile "c:\alm\" -Verbose

(note this will take a couple of minutes)


  1. Navigate to the solutions view and open the Device Order Solution

Congratulations, you have earned the badge: “Importing Dataverse data”


The Device Ordering Power Apps Canvas App

Do not proceed before going through the lab pre-requisite steps

This lab will create a new device ordering application to replace an existing paper- and email-based system.

Power Apps Canvas Studio Layout

Power Apps Canvas Studio is available as a web application that you can use in any modern browser.

Power Apps Studio is designed to have a user interface familiar to users of the Office suite. It has three panes and a ribbon that make app creation feel like building a slide deck in PowerPoint. Formulas are entered within a function bar that is like Excel. Studio components:

  1. Left navigation bar, which shows all the screens, data sources, and controls in your app
  2. Middle pane, which contains the app screen you are working on
  3. Right-hand pane, where you configure properties for controls, bind to data, create rules, and set additional advanced settings
  4. Property drop-down list, where you select the property for the selected control that you want to configure
  5. Formula bar, where you add formulas (like in Excel) that define the behavior of a selected control
  6. Ribbon, where you perform common actions including customizing design elements
  7. Additional items, here you will find your environment selection, app checker, and the preview app functionality.


Add an order to the Device Order Dataverse entity using a form.

The solution imported was a read-only view; in this exercise we are going to update it to take orders and write them to the Orders table.

  1. Open the Device Ordering Application from the solutions view.


  2. Click on Insert > Forms and select Edit.

  3. Resize the edit form to the bottom of the application.

  4. Set the data source of the form to the “Device Orders” entity.

  5. Remove the field “Created On”.

  6. Add the fields Name, Price, Requested By, and Requested Date.

  7. Set the form to have two columns.

  8. To set the properties of the Name field, select the Name text box and then Advanced properties. Select “Unlock to change properties”.

  9. Scroll down to the Default property and type:

'Device Gallery'.Selected.'Device Name'

  10. Unlock the “Requested By” property.

  11. Set the “Requested By” property to


  12. Unlock the “Price” property.

  13. Set the “Price” default property to:

Text('Device Gallery'.Selected.Price, "$.00")


  14. Rename the edit form from Form1 to “OrderForm”.


  15. Add an icon for our save: click on the Icons menu, select “Check”, and move it to the bottom right of the screen as indicated below.

  16. To write the order details to the Dataverse entity, select the Check icon and in its OnSelect property type:


NOTE: If you try running your application now, you will find your order form disappears! Let’s “fix” this!

  17. To display the edit form, select the Device Gallery, choose its “OnSelect” property, and type:



  18. Click File and select Save.
  19. Click the back arrow.

Congratulations badge: “Writing Data to practically any data source!”


Using Environment Variables

The device ordering application offers a link for customers to request support. While in the development phase, the team doesn’t want to send test requests to the support team, but they also don’t want to make application changes for their production deployment, which is what they are doing now.

This section will walk through updating the application so that environment variables solve this problem.

For more information about using environment variables directly in Canvas applications see this great community article:

Working with Environment Variables in Canvas Power Apps and Power Automate Flows | The CRM Chap

Done and looking for something else to do?

  • Apply a theme to your application for a new look and feel.
  • Add navigation to the header component.
  • Give the controls a responsive layout
  • Create your own component for the form submission.

Create an Azure DevOps Project

In this section of the hands-on lab, you will create an Azure DevOps project, setup permissions and create the required pipelines to automatically export your app (as an unmanaged solution) from a development environment, generate a build artifact (managed solution) and finally deploy the app into production. The lab will also introduce you to Microsoft Power Platform Build Tools.

We are going to use Azure DevOps for both source code storage and build and deployment automation. You can use any automated source control and build automation tools using the same principles. Your configuration data should be exported from a development environment, processed by Package Deployer and checked into source control.

  1. Create a DevOps project by going here and selecting “Sign into Azure DevOps”

    For the Pilot we will be using the organization:


  2. Select “+ Create project”

  3. Name the project “Your Name – DevOps Project”, make it a Private project and select “Create”

  4. Your DevOps project has now been created, please note the URL on the top for your project and bookmark it or save it locally

  5. You are now ready to begin the ALM Hands-on Lab


  1. When the project has been created, you will need to create a repo to hold the source code. Click on the Repos link in the left navigation.

  1. Initialize the repo with the default README by clicking the Initialize button.

Below is a screenshot of the repo post initialization.

Install and Enable the Azure DevOps Extensions for Power Platform

In the pilot class we are using a Beta of the Power Platform Build Tools. To enable this, we are all sharing one Azure DevOps Organization and this section has been done for you.

Soon you will be doing this in your own organizations using the Azure DevOps Market Place.

  1. Navigate to the organization page, click the Azure DevOps link, then the Organization Settings link in the bottom left of the navigation panel


  2. On the organization settings page, click the Extensions link in the left navigation panel


  3. For this lab we will be using a beta build of the Power Platform Build Tools to enable the use of environment variables. This beta extension has already been shared with this organization and can be found by going to the Shared tab.

  1. Select Install

  2. Select “Go to Marketplace”



After the event, you will find the Power Platform Build Tools in the Azure DevOps marketplace.

In addition to creating build pipelines, we will be updating environment variables and will be using the “Replace Tokens” extension; it too needs to be installed.

  1. Click ‘Browse marketplace’. This will redirect you to the Azure DevOps marketplace. Search for “Replace Tokens” and select the Replace Tokens extension. Fun side note: this widely used and popular extension was written and is maintained by a Microsoft MVP.

  1. Click ‘Get it free’.

  1. Click ‘Install’
  2. Click ‘Proceed to organization’

Configure Azure DevOps Permissions for Build Service Account

The build pipelines that we set up later will be exporting files from an org and checking them into your source code repo. This is not a default permission of Azure DevOps, so we need to configure the appropriate permissions for the pipeline to function properly.

  1. Navigate back to the project created earlier:


  1. Click the Project Settings icon on the lower left of the screen and click the Repositories link in the fly out menu.

  1. Navigate to the Repositories Menu

  1. Type in Project Collection Build Service in the search box and select it (select the project collection with the username appended)

  1. The setting for the user is displayed. Select Allow for ‘Contribute’ and ensure that the green checkbox is displayed.

Create a Service Connection

These build pipelines will be importing and exporting solutions from environments that could potentially span tenants. A Service Connection is where the credentials for this access are stored. This section will walk through creating a Service Connection.

  1. Under Project settings select “Service Connections”

  1. Select: “Create service connection” Button


  1. Select Generic. Note: For production pipelines it is recommended that you use the Power Platform Service Connection for more secure scenarios like Multi-Factor Authentication (MFA).

  1. Specify the Server URL for the environment where we imported and setup our solution.

  1. Retrieve the Server URL in the Power Platform Administration Center:

  1. Supply the credentials for your tenants as supplied by the instructor.

  1. Save the Service Connection

Build Pipeline 1: Create Export from Dev

The first pipeline you will create will export your solution from your development environment as an unmanaged solution, unpack it, and check it into source control (your repo).

  1. Start with the YAML file.



  1. Click the Create Pipeline button.

  1. Select “Azure Repos Git” for “Where is your code?”

  1. Select our Hello DevOps repo created above.

  1. Select the Existing Azure Pipelines YAML file option.

  1. Select the “Export-From-Dev.YML” file checked in above.

  1. Save and run your newly created pipeline.

NOTE: If you see the following errors, you may have missed the step of adding the Power Platform Build Tools BETA above.

Congratulations you have successfully exported your solution from a Development environment to an Azure DevOps repo!

  1. You can validate what the build did by looking at the repo.

Automate the configuration and data import into the Solution

As when we manually imported the solution, it comes in without the state data needed for the application. In this set of steps we are going to automate the import of the application data.

If you didn’t manually import the data above, you will need to download the file. You can find it here:

  1. Under Repos > Files, open the DeviceOrder folder and select the vertical ellipsis so you can select New > File.

For the file name call it:


Note: The file name should have automatically added the “DeviceOrder” folder name in the dialog.


  1. Commit this file to the repo

  1. Add all the files from the download to this folder

  1. Drag all three files to the “Drag and drop files here” area and commit the three files. (The reason we didn’t just create a folder above is that Git doesn’t support empty folders.)

  1. Commit those file additions/updates.

Deployment Settings

  1. Deployment settings are contained in the file deploymentSettings.JSON, which resides in the DeviceOrder folder.

To add this file select the vertical ellipsis to the right of the SolutionPackage folder, select upload file and upload this file:

  1. Commit this file addition.

Create the Deploy to Test Pipeline

As above, we will create the pipeline that deploys to the Test environment from an existing file.

  1. Download the pipeline definition file from here:
  2. Upload this pipeline definition into the repo. NOTE: This file needs to be in the root of the project repo.


  1. Upload the build-and-deploy-to-test.yml file to the project repo

  1. Create a new pipeline by selecting the Pipelines Flyout menu.


  1. Create the pipeline based on Azure Repos Git

  1. Create the pipeline based on the build-and-deploy-to-test.yml file in our project repo

  1. Select the Existing Azure Pipelines YAML file option

  1. Select the build-and-deploy-to-test.yml file and choose “Continue.”

  1. Save pipeline definition.



Setting up the variables for the Test Deployment Pipeline

The deploy to test pipeline definition expects a username, URL and password for the environment being deployed into. This section will set the values for those variables.

  1. Select the variables button.

  1. Select New Variable

  1. Create a variable for the Username using the settings supplied by the instructor…or the environment you want to deploy to.

  1. Add another variable.

  1. Name it Password and set the value to the credentials supplied for the Test environment.

  1. Create another variable named “Url” and set the value to your Test environment URL.
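For any script steps you later add to the pipeline, note that Azure DevOps exposes non-secret pipeline variables to task processes as environment variables (with the name uppercased), while secret variables such as Password must be mapped into a task explicitly. A small Python sketch of reading a non-secret variable, assuming that uppercasing convention:

```python
import os

def get_pipeline_variable(name: str, default=None):
    """Read a non-secret Azure DevOps pipeline variable.

    Azure DevOps publishes non-secret variables to task processes as
    environment variables with the name uppercased (e.g. Url -> URL).
    Secret variables (like Password) are not published automatically
    and must be passed in via the task's env mapping.
    """
    return os.environ.get(name.upper(), default)

# Example: fall back to a placeholder URL when the variable is unset.
url = get_pipeline_variable("Url", "https://example.crm.dynamics.com")
```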

Creating the Test Service Connection

As the Test Deployment pipeline will be deploying into another environment, and potentially another tenant, a new Service Connection is needed to maintain the connection credentials.


  1. Under Pipelines select “Service Connections” then the button “New Service Connection”

  1. Set the values of the service connection for the credentials and Server URL to the Test (Destination) Environment.



Making the pipelines dynamic through Environment Variables

A common development scenario is to stub calls to production services during the development process so as not to flood those services with test calls; this technique is often called using fakes. In the Device Order application, we don’t want to send the support team nonproduction support requests, so we made the support team’s email address an environment variable. In this solution we have two artifacts we want to update dynamically: the connection reference the email is sent with, and the email address itself, which you can see defined as the environment variable new_SupportEmailAddress. This section will show how build pipelines can specify these settings as the solution progresses from Dev to Test to Production, without changing the underlying application.

This screenshot shows the environment variable in the solution

This screenshot shows the environment variable properties in the solution.


In our project these settings are held in the deploymentSettings.JSON file.

Contents of deploymentSettings.JSON


{
  "ConnectionReferences": [
    {
      "LogicalName": "new_sharedoffice365_1812f",
      "ConnectionId": "e5c5fd57c8c845dc8fd1d063a89269eb",
      "ConnectorId": "/providers/Microsoft.PowerApps/apis/shared_office365"
    }
  ],
  "EnvironmentVariables": [
    {
      "SchemaName": "new_SupportEmailAddress",
      "Value": ""
    }
  ]
}

To make the properties dynamic we need to tokenize the deployment settings file and add a replace-tokens task to our pipeline

  1. Tokenize the deploymentSettings.JSON by adding token characters to the setting values. The replace token extension will then set these values in the pipeline at runtime.


{
  "ConnectionReferences": [
    {
      "LogicalName": "new_sharedoffice365_1812f",
      "ConnectionId": "#{new_sharedoffice365_1812f}#",
      "ConnectorId": "/providers/Microsoft.PowerApps/apis/shared_office365"
    }
  ],
  "EnvironmentVariables": [
    {
      "SchemaName": "new_SupportEmailAddress",
      "Value": "#{new_SupportEmailAddress}#"
    }
  ]
}

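Under the hood, the Replace Tokens task scans the target files for the #{…}# pattern and substitutes each token with the value of the matching pipeline variable. A rough Python equivalent of that substitution (assuming the extension's default #{ }# token prefix and suffix):

```python
import re

def replace_tokens(text: str, variables: dict) -> str:
    """Replace every #{name}# token with its value from `variables`."""
    return re.sub(r"#\{(\w+)\}#", lambda m: variables[m.group(1)], text)

line = '"Value": "#{new_SupportEmailAddress}#"'
print(replace_tokens(line, {"new_SupportEmailAddress": "support@example.com"}))
# → "Value": "support@example.com"
```

Note that if a token has no matching variable this sketch raises a KeyError, which is usually what you want in a build: failing loudly beats deploying a file with unreplaced tokens.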
The next step would be to set up the Replace Tokens extension by editing the build pipeline. (This has been done for you.)

Then set the properties of the Replace Tokens extension so we are looking in the solution folder and only at the deploymentSettings.JSON file. (This has been done for you.)

We then need to create variables to supply the values for the ConnectionId and new_SupportEmailAddress.

  1. Set the ConnectionId variable properties to the name new_sharedoffice365_1812f and a value of e5c5fd57c8c845dc8fd1d063a89269eb

  1. Create the new_SupportEmailAddress variable with the name new_SupportEmailAddress and a value of some email address.

  1. Run the Build-and-deploy-to-test Pipeline


Note: If your build gives the error:

“Could not load file or assembly ‘System.Management.Automation.resources”

This is a generic error in the Beta extension for ALL errors. To see what is causing the build error you will need to enable diagnostics and investigate the real cause. Common causes are missing variables and attempting to deploy managed solutions into environments with unmanaged versions of that solution already imported. For instance, in the diagnostics, you can see the output:

##[debug]Message: Solution manifest import: FAILURE: The solution is already installed on this system as an unmanaged solution and the package supplied is attempting to install it in managed mode. Import can only update solutions when the modes match. Uninstall the current solution and try again.Detail:






For final confirmation, log into your production system and see your application!
<Greg Hurlman to add PCF section>



© 2019 Microsoft Corporation. All rights reserved.

Information in this document, including URL and other Internet Web site references, is subject to change without notice. Unless otherwise noted, the example companies, organizations, products, domain names, e-mail addresses, logos, people, places, and events depicted herein are fictitious, and no association with any real company, organization, product, domain name, e-mail address, logo, person, place or event is intended or should be inferred. Complying with all applicable copyright laws is the responsibility of the user. Without limiting the rights under copyright, no part of this document may be reproduced, stored in or introduced into a retrieval system, or transmitted in any form or by any means (electronic, mechanical, photocopying, recording, or otherwise), or for any purpose, without the express written permission of Microsoft Corporation.

Microsoft may have patents, patent applications, trademarks, copyrights, or other intellectual property rights covering subject matter in this document. Except as expressly provided in any written license agreement from Microsoft, the furnishing of this document does not give you any license to these patents, trademarks, copyrights, or other intellectual property.

The names of manufacturers, products, or URLs are provided for informational purposes only and Microsoft makes no representations and warranties, either expressed, implied, or statutory, regarding these manufacturers or the use of the products with any Microsoft technologies. The inclusion of a manufacturer or product does not imply endorsement of Microsoft of the manufacturer or product. Links may be provided to third party sites. Such sites are not under the control of Microsoft and Microsoft is not responsible for the contents of any linked site or any link contained in a linked site, or any changes or updates to such sites. Microsoft is not responsible for webcasting or any other form of transmission received from any linked site. Microsoft is providing these links to you only as a convenience, and the inclusion of any link does not imply endorsement of Microsoft of the site or the products contained therein.

Microsoft and the trademarks listed at are trademarks of the Microsoft group of companies. All other trademarks are property of their respective owners.



Bringing in Power Apps application Data in your ALM work flows


The suggested best practice for backing up and recovering your Power Apps, and for ALM deployments, is to use solutions.

Unfortunately, exporting and importing a solution doesn’t bring the application data along with the entity (Table) definitions.

I have been playing with the ALM workflows and wanted to share how you can automate data retrieval when importing your solutions.

The first step is to use Power Apps solutions for your application.

The next step is to extract the schema and data that your application relies on to run. (If you don’t have a solution that uses data skip this step)

While you can also automate this (see the PowerShell automation process below), I did it manually with the DataMigrationUtility tool that comes with the developer tools:

Download tools from NuGet (Developer Guide for Dynamics 365 Customer Engagement) | Microsoft Docs

The next step is to import your solution into a new environment. If you don’t have a solution to play with, I have created a solution for the App in a Day training that contains tables for Devices and Manufacturers.

You can find that here: <link>

But as I mentioned above, the application is imported without the data it needs to work. To populate those entities (Tables) we could do it manually using the DataMigrationUtility tool, like I did to export the data, but since we will likely want to automate this, let’s use PowerShell:


$cred = Get-Credential

$crmConn = Get-CrmConnection -OrganizationName ContosoEnvironment -OnLineType Office365 -Credential $cred

Import-CrmDataFile -CrmConnection $crmConn -Datafile "" -Verbose







Finding and using the Power Apps Sample Templates

In addition to the over 500 community samples, a lot of people don’t realize the product comes with a plethora (in case you ever wondered how many is a “plethora”: in this case it is 34 <g>) of amazing samples and templates.

To use the built-in samples and templates, log into

Then from the home screen select “All Templates”.

Note: This is the same as selecting “Create” in the leftmost navigation.

Scroll about 1/3 of the way down and you will find the built-in application templates.

If you are looking for a suggestion/starting point the training template is hard to beat!

Note: this does not include the Project Oakdale templates!

Those are even easier to locate and use as they are called out in the documentation here!


Remember, if you need training on Power Apps you can find it here: