A selection of Natasha’s finest art examples!
Formative Years
Gift Creations
Love of Anime
Love of Doggos
Australian Wombats
First Professional Work
Since I stopped using Twitter (and the account has since been hacked), people have been asking what I have been working on since retirement.
The truth is mostly (boring) home projects that I had put off, needed to list our Redmond house for sale: clearing out a creek bed, removing tree stumps from our yard, tearing down an old barn and hauling it to the dump, and clearing land around our Florida house for a new fence and gates. The plan is to sell the Redmond house and "snowbird" between a Pacific Northwest house on the Olympic Peninsula and our house near Key West.
A couple of these projects have been "interesting", like refurbishing* a 1985 Ford F350 (460 big block, 4-speed) and using it to tow a 10,000 lb, 32′ boat to Florida, and some have even been fun, like buying a 2002 (red) Turbo Beetle for my daughter and restoring/resto-modding it back to as-new condition. (In this picture I am polishing the headlights back to a clear state.)
All that said, the project I have been looking forward to the most is attempting to install a superbike engine into an older jet-powered ski boat to make the "ultimate" shallow-water flats boat for Florida.
My goals for this project are:
To this end I have been watching the insurance salvage sites, OfferUp and Craigslist for my "donor" vehicles.
The first vehicle to show up was a 2006 BMW K1200GT that had been in a fire. I bid $25 and, much to my wife's chagrin, won it!
After tearing off the melted fairings, burnt seat and destroyed fenders, replacing the wiring harness and getting new handlebars (to replace the burnt switches), I managed to get it started. I now have a very light 150 HP, shaft-drive, four-stroke motor for my jet boat project!!! Woot!
Now to start looking for a donor boat that met my criteria: an older fiberglass ski boat, jet-powered by a Berkeley pump (the most common jet, to ease finding parts), on a trailer and, most importantly, near free!
The last boat I found that fit these criteria was an 18′ Apollo for $500, listed 5 years ago…. With the boat shortages due to the pandemic, I assumed I would be looking for at least 6 months.
The plan was that while looking for a donor boat I would turn the bike into a café racer/scrambler and ride it around the neighborhood (my neighbors would love that!).
Found a donor boat!
While scanning the local Craigslist I found an ad for: "$300 Ski boat". No manufacturer, no age, no engine details, but in the pictures I could see it had a solid-looking trailer and a Berkeley jet.
So, like any wise shopper, I told my wife I was going to "go shopping", hit the ATM for $300 and told the owner I wanted it, sight unseen! (Okay, not that big a risk, as the trailer was worth more than the asking price.)
So what did I get?
While there was no identification on the hull, from pictures I identified it as a ~1972 Jolly Rogers 17.
The Berkeley jet is a 12JC-A.
When I lifted the engine cover I was shocked to find an engine!… Specifically an Oldsmobile 455 big block, per the casting marks, with factory forged internals and Hardin manifolds!!!
Unfortunately, a previous owner had pulled the intake and distributor allowing the engine to fill with water (grrrr).
So what’s next?
Clearly I need to remove the current motor and mount the motorcycle motor into the boat… which will be interesting, as the BMW motor is designed to be suspended from above versus supported from below like the Olds 455.
That said, I see some other issues right off:
As I will be fishing in Florida much of the spring and fall, I'm guessing the first water trial will be next winter.
(Yes, I realize I am snowbirding a little off…. This schedule is to spend the holidays with the family in the Pacific Northwest.)
*Pulled the Ford F350's 460 big block and replaced the clutch, intake manifold, windshield, carb, fuel pump, water pump, alternator, pressure plate, valve covers, brakes, wiper motors, all the belts, seats, navigation, master brake cylinder, heater fan, power steering pump and gauges, and rebuilt the dash…. In retrospect I am not certain I would have towed such a large boat across the country in such an old truck… but very, very glad it is done!
A year ago I wrote an article, Presentation skills Brown Bag in a virtual world – Sterlings (wordpress.com), which was inspired by brown bags I used to deliver at Microsoft (10 Techniques to help you present better – Sterlings (wordpress.com)).
That led one of our MVP leads in the APAC region to ask me:
How do you enable new speakers…particularly in a virtual world?
My pre-Covid answer would have been:
Which raises the question… how do you translate this to a virtual setting?
…With some orthogonal thinking, small tweaks and minor additions…
Amending my virtual-world presentation skills suggestion list below with those enhancements:
Lisa Crosbie
to share how you or your co-presenter can help with the presentation chat/hands, etc.
https://www.toastmasters.org/resources/online-meeting-tips
Data Flows in Power Apps by Olena Grischenko (30 minutes)
No-code integration? No-code data migration? There are things that aren't easy to un-learn in the development world, and system integration and data migration are two of the most difficult parts of any business application project. Can we solve them with no-code? Yes we can, in a very easy and cost-effective way. You could ask why: why do we need new tools in a space where we have solved problems the same common way for many years? This talk is useful for developers as well as for people who don't use code. Learn about new possibilities of the Power Platform which nobody talks about even though they do exist. Learn about simple and not-so-simple transformations from CSV, text files and SQL to Dataverse. Become a new Power Platform data integration superhero!
Sneak Peek at Commanding for model-driven Power Apps by Scott Durow (30 minutes)
Changing the commanding in model-driven Power Apps has traditionally been reserved for code-based solutions. In this session, Power Platform MVP and User Group leader Scott Durow will show us a sneak peek at how this is going to be much easier in the near future. (Sorry, this portion is currently not planned to be recorded or posted due to its unreleased nature.)
Olena Grischenko
Australia
Technomancy – make systems work like magic: https://technomancy.com.au/ #PowerLabs – start with Microsoft technologies! https://www.meetup.com/PowerLabs/
Scott Durow
Canada
Hi, I’m Scott and I am truly passionate about helping people get the most out of the Microsoft Power Platform
Biography
Scott is a committed and personable software architect/technologist with a successful track record for realising business vision through enterprise/application architectures that are tightly aligned with budget and time scales. By combining his detailed technical knowledge with a clear grasp of the wider commercial issues Scott is able to identify and implement practical solutions to real business problems. Scott is also an excellent communicator and technical author.
Last October I was lucky enough to get a place in Florida on Little Torch Key and spent a lot of my winter chasing grouper, snapper and particularly Tarpon with the gear I brought with me from Washington.
While I caught a lot of fish, and some nice juvenile Tarpon, while in Florida, I came to realize that the gear I brought down wasn't ideal, as it was designed for Pacific Ocean Sea Bass, Ling Cod and Salmon… all with moderate-to-slow-action rods and reels designed more for dropping and trolling than casting hundreds of times per trip.
For light gear this was easily resolved by getting a 7′ fast extension rod for my Azores 4000-series spinning reel. This setup, with a ¼ oz jig head* and a 4″ paddle tail, is my go-to for practically everything around Little Torch Key.
For larger fish, medium-sized tackle and fishing live baits, I would trade back and forth between my Okuma Komodo 364 (just an "okay" casting reel) and my Lexa 400 (a little too big for casting those ¼ oz jig heads) with one of my custom three-piece rods I wrapped….
While this medium-sized setup "worked", due to the size I found myself always gravitating back to the light gear… and I ended up hooking enough large sharks and monster Barracuda on that light setup to realize this 4000-class spinning gear was outclassed by a lot of the fish I wanted to be targeting.
Clearly a new Medium-Heavy class setup was called for!
As I wrap my own rods, picking a heavy, fast-action 7′–7′6″ blank and acid-wrapping it is an easy solution.
The challenge came in selecting the reel, as there are several great choices and I wasn't certain how they really line up.
Starting with what I didn’t like in my current gear:
With that information I went out and collected the specifications of the Shimano and Daiwa reels in this size.
I eliminated all the Okuma and Penn reels as they do not have disengaging level winds, and removed Abu Garcia as I don't see many of them in the shops down in the Keys and they don't seem to be as well known for saltwater.
With that being the case let’s take a look at the characteristics of these reels!
| Company | Model | Weight (oz) | Price | Line capacity, 14 lb (yd) | Line per crank (in) | URL |
|---------|-------|-------------|-------|---------------------------|---------------------|-----|
| Daiwa | ProRex 400 | 12.2 | $319.00 | 284.375 | 43 | https://daiwa.us/collections/baitcasting-reels/products/prorex-tw |
| Daiwa | Lexa 300 | 11.6 | $200.00 | 190 | 32 | https://daiwa.us/collections/baitcasting-reels/products/lexa-wn |
| Daiwa | Lexa 400 | 16.4 | $249.00 | 308.75 | 33 | https://daiwa.us/collections/baitcasting-reels/products/lexa-wn |
| Daiwa | Tatula 300 | 11.5 | $269.00 | 215 | 29 | https://daiwa.us/collections/baitcasting-reels/products/tatula-300 |
| Shimano | Tranx 300 | 11.6 | $279.00 | 180 | 30 | |
| Shimano | Tranx 400 | 12 | $299.00 | 260 | 40 | |
| Shimano | Curado 300 | 10.6 | $199.00 | 180 | 35 | |
Weight
While weight isn't really the issue, it is often an indicator of a reel's size and its ability to be palmed while casting, and should give insight into how comfortable it will be for 8+ hours of casting.
While the Lexa 400 is clearly the standout heaviest reel in the collection, what was amazing is how similar the rest of these reels were at 11–12 ounces; it seemed any of these reels would work in terms of size and weight.
Line Capacity
As this is to be my medium-heavy rig, it needed the line capacity to let fish run! With most of the reel weights being about the same, a reel with larger line capacity seemed preferable, so either the ProRex or the Tranx 400.
Price
Just as weight is an indicator of size, price can inform you of build quality… Must admit I was surprised to see the ProRex more expensive than the Tranx!
External Anti-Lash Brakes
Must admit I started this effort to justify buying a Tranx, but finding out that Shimano uses an anti-lash braking system that forces you to take the side cover off leveled the playing field!
In Florida I fish a very small (13′) technical skiff and have found “If something can fall into the water…it will!”
Conclusion
After using a couple of Lexas and Curados, I was confident any of these reels would be a great companion for my slightly-too-large Lexa 400.
After looking at the data and their price points, I decided on getting a Tatula 300.
While looking for a good price on a Tatula 300, I found a ProRex 400 sitting in SportCo for $230!!!
While not the reel I had decided on, getting the most expensive reel in the roundup for the cheapest price was simply too good a deal to pass up!
Winner: Daiwa ProRex 400
*One issue I am finding here in the USA is that most ¼ oz jig heads have small wire hooks…. In Australia it is pretty common to have 1/8 and ¼ oz jig heads with larger 5/0 and 7/0 hooks.
There are over 400 connectors to connect Power Apps to almost every conceivable service, database or back end. This article shows how to connect to endpoints that don't have a connector by building a custom, purpose-built Azure Function and wrapping it in a custom connector. The documentation does a great job of walking through creating an Azure Function: https://docs.microsoft.com/en-us/azure/azure-functions/create-first-function-vs-code-csharp. This walkthrough assumes you have followed those directions to create an Azure Function, and starts from there.
Select the Azure Functions extension for Visual Studio Code and add a new function using the lightning bolt glyph.
In this example we have named the function “SydneyWeather”
In this example we have used the name “Getweather.Function”
Since this is a demo, set the access to anonymous.
This lab updates the function created earlier in the https://docs.microsoft.com/en-us/azure/azure-functions/create-first-function-vs-code-csharp lab.
(Note: this is for demo purposes only; normally one would not check in directly to a production site.)
In this example the function was deployed to a function app named "GetTemperature", so it can be called using the following URL:
https://gettemperature.azurewebsites.net/api/Sydneyweather
This returned the following:
“This HTTP triggered function executed successfully. Pass a name in the query string or in the request body for a personalized response.”
If a query string is passed in, like the following:
https://gettemperature.azurewebsites.net/api/Sydneyweather?name=Chuck
it will return the following:
“Hello, Chuck. This HTTP triggered function executed successfully.”
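If you would rather test from a terminal than a browser, a quick PowerShell sketch (using the example URL above; adjust it to your own function app) looks like this:
# Call the HTTP-triggered function with a query-string parameter;
# Invoke-RestMethod returns the plain-text body for this default template.
$uri = "https://gettemperature.azurewebsites.net/api/Sydneyweather"
Invoke-RestMethod -Uri "${uri}?name=Chuck"
# -> Hello, Chuck. This HTTP triggered function executed successfully.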
While the Azure Functions extension for Visual Studio Code makes it easy to create an HTTP-based function, the default template returns a plain string rather than JSON… So while these functions can be called from a flow in Power Automate, Power Apps requires all connectors to return JSON.
Click on the file view of Visual Studio Code and open the SydneyWeather function.
Comment out the template return and paste in the following code (this uses JsonConvert, so make sure the file has a using Newtonsoft.Json; directive):
//return new OkObjectResult(responseMessage);
var myObj = new { name = "Mostly Sunny", location = "Sydney" };
var jsonToReturn = JsonConvert.SerializeObject(myObj);
return new OkObjectResult(jsonToReturn);
This should look like the following image:
Running the updated function in the browser will now return the following JSON:
Congratulations your Azure function is now ready to be called by Power Apps!
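As a quick sanity check (again assuming the example deployment above), Invoke-RestMethod parses the JSON into an object, which is a handy way to confirm the shape Power Apps will see:
# The JSON body is parsed into a PowerShell object automatically
$response = Invoke-RestMethod -Uri "https://gettemperature.azurewebsites.net/api/sydneyweather"
$response.name      # Mostly Sunny
$response.location  # Sydney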
Change the verb to "GET" and paste in your function URL…. From the example above it would be:
https://gettemperature.azurewebsites.net/api/sydneyweather
Then click Import.
Using the response from the browser run, paste the response into the body and select Import. In the example above it would be:
{"name":"Mostly Sunny","location":"Sydney"}
Add a button and set its Text property to the custom connector namespace, function name and returned property.
In this example it would be:
SydneyWeather.SydneyWeather().name
Congratulations you have created an Azure function, a custom connector and called them from Power Apps!
Situated along the continental US's only tropical reef, at Little Torch Key, Paradise Blue is a diver's or fisherman's dream vacation location.
For scuba divers, Paradise Blue is located near Looe Key in the Florida Keys National Marine Sanctuary.
Paradise Blue Features:
Winter is an inshore fisherman's dream, with Speckled Trout, Mackerel, Kingfish and Grouper on the inshore reefs.
Spring is the beginning of the Tarpon and Permit seasons, with the offshore Blackfin Tuna very willing to visit.
Summer is when the offshore game fish and the flats for Bonefish and Tarpon really light up!
All year, Yellowfin, Mutton and Mangrove snapper await your angling pleasure!
Looe Key is a coral reef located within the Florida Keys National Marine Sanctuary. It lies to the south of Little Torch and Big Pine Key. This reef is within a Sanctuary Preservation Area (SPA). Part of Looe Key is designated as “Research Only,” an area which protects some of the patch reefs landward of the main reef.
The reef is named after HMS Looe, which ran aground on the reef and sank in 1744.
Elkhorn Coral at Looe Key
Paradise Blue offers its residents a private pool, hot tub and sunbathing station on the canal for any sun worshipper looking to achieve a golden hue!
With a tiki bar, cards, board games, a kayak, a stand-up paddleboard, a barbecue, a covered sitting lounge, a large screened porch, a large-screen TV, an Xbox One S with over 50 games and an ice maker capable of making 100 lbs of ice per day, Paradise Blue's residents should never want for things to do.
With three bars, several restaurants, two tackle shops and a large grocery store within 5 miles, conveniences are never far away.
Little Torch Key is an island in the lower Florida Keys.
U.S. Route 1 (also known as the Overseas Highway) crosses the key at about mile markers 28–29. It is immediately preceded to the northeast by Big Pine Key and followed by Middle Torch Key to the southwest.
A small island 24 miles (39 km) from Key West, Little Torch Key is home primarily to locals, living and working from Big Pine Key to Key West. The island is also host to visitors who don’t mind a commute to the popular destination of Key West. There are a few, but not many businesses on the island, including restaurants and lodging.
Like all of the keys in the Torch Keys, this key was probably named for the native torchwood tree, Amyris elemifera L. The north end of the key is the site of a former settlement which was abandoned in 1938 when the highway was relocated.
Its most likely claim to fame is as a relatively frequent fishing destination for U.S. President Harry S. Truman. A Reuters story on February 14, 2009, named a resort there as one of the “Top 10 most romantic retreats”.
For booking Paradise Blue please visit:
https://www.keyswidevacationrentals.com
This solution simulates how a large organization might supply its employees' hardware needs. The offering consists of a secure application that enables comparing and ordering company-confidential configurations and pricing, can run on mobile devices, and offers support that takes advantage of global/public-facing locations and knowledge bases. This training will focus on automating the build and deployment of this application.
The business requirements for the application are:
Task 1: Download the Power Apps Solution file
This application will be created in a dedicated environment that enables architectural, security and organizational separation as well as geographic specificity.
Overview:
This hands-on lab will give you an opportunity to get hands-on with the best practices for getting your app into source control, generating a managed solution from source (your build artifact) and finally deploying the app into another environment. You will need access to 3 Dataverse environments (Development, Build & Production) along with Azure DevOps to automate deployments.
Lab Setup:
You will need to create three environments in your demo or customer tenant. To do this follow these instructions:
ALM Hands-on Lab Overview:
During this lab you will use the account, login and environments you created in the previous steps. You will get hands-on with the full set of Application Lifecycle Management (ALM) capabilities for the Power Platform. Learn how to use the key concepts that allow customers and partners to manage solutions and deploy them across environments. Additionally, get up to speed with the details of canvas apps and flows in solutions.
To enable moving your application across environments, using components and custom entities created earlier, this lab leverages a solution you can find here: https://aka.ms/AIADSolution
Note: this may take a couple of minutes
Note: importing the solution will have you create new connections in this environment for the ones that were in the solution.
This process also populates environment variables in the solution.
These steps will populate the Dataverse tables brought in with the solution import using PowerShell. Note these same steps can be done manually using the DataMigrationUtility tool.
Install-Module -Name Microsoft.Xrm.Tooling.PackageDeployment.Powershell -RequiredVersion 3.3.0.833
When prompted, allow the module to install; after a successful install, run the following:
# Prompt for the credentials used to connect to the environment
$cred = Get-Credential
# Connect to the Dev environment (chass-Dev is this lab's environment name)
$crmConn = Get-CrmConnection -OrganizationName chass-Dev -OnLineType Office365 -Credential $cred
# Import the data package exported earlier with the Configuration Migration tool
Import-CrmDataFile -CrmConnection $crmConn -Datafile "c:\alm\Data.zip" -Verbose
(Note: this will take a couple of minutes.)
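For unattended automation (for example, inside a pipeline) you won't want an interactive credential prompt. Below is a minimal sketch using a connection string with a service principal instead; the URL, app ID and secret are placeholders you would supply from your own tenant:
# Hypothetical non-interactive connection for automation scenarios
$connString = "AuthType=ClientSecret;Url=https://chass-dev.crm.dynamics.com;ClientId=<app-id>;ClientSecret=<secret>"
$crmConn = Get-CrmConnection -ConnectionString $connString
Import-CrmDataFile -CrmConnection $crmConn -Datafile "c:\alm\Data.zip" -Verbose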
Congratulations you have earned the badge: “Importing Dataverse data”
IMPORTANT:
Do not proceed before going through the lab pre-requisite steps
This lab will create a new device-ordering application to replace an existing paper- and email-based system.
Power Apps Canvas Studio is available as a web application (http://make.powerapps.com) that you can use in any modern browser.
Power Apps Studio is designed to have a user interface familiar to users of the Office suite. It has three panes and a ribbon that make app creation feel like building a slide deck in PowerPoint. Formulas are entered within a function bar, much as in Excel. Studio components:
The solution imported was a read-only view; in this exercise we are going to update it to take orders and write them into the Orders table.
'Device Gallery'.Selected.'Device Name'
User().Email
Text('Device Gallery'.Selected.Price, "$.00")
SubmitForm(OrderForm)
NOTE: If you try running your application now, you will find your order form disappears!… Let's "fix" this!
NewForm(OrderForm)
Congratulations, you have earned the badge: "Writing data to practically any data source"!
Using Environment Variables
The device-ordering application offers a link for customers to request support. While in the development phase, the team doesn't want to send test requests to the support team, but they also don't want to make application changes for their production deployment, which is what they are doing now.
This section will walk through how updating the application to use environment variables solves this problem.
For more information about using environment variables directly in Canvas applications see this great community article:
Working with Environment Variables in Canvas Power Apps and Power Automate Flows | The CRM Chap
In this section of the hands-on lab, you will create an Azure DevOps project, setup permissions and create the required pipelines to automatically export your app (as an unmanaged solution) from a development environment, generate a build artifact (managed solution) and finally deploy the app into production. The lab will also introduce you to Microsoft Power Platform Build Tools.
We are going to use Azure DevOps for both source code storage and build and deployment automation. You can use any automated source control and build automation tools using the same principles. Your configuration data should be exported from a development environment, processed by Package Deployer and checked into source control.
For the pilot, the instructor will give you credentials (e.g. Mike@Powermvps.com); we will be using the organization: https://dev.azure.com/PPDevelopment
Below is a screenshot of the repo post initialization.
In the pilot class we are using a beta of the Power Platform Build Tools. To enable this, we are all sharing one Azure DevOps organization, and this section has been done for you.
Soon you will be doing this in your own organizations using the Azure DevOps Market Place.
After the event you will find the Power Platform Build Tools in the Azure DevOps Marketplace.
In addition to creating build pipelines, we will be updating environment variables and will be using the "Replace Tokens" extension; this too needs to be installed.
Configure Azure DevOps Permissions for Build Service Account
The build pipelines that we set up later will be exporting files from an org and checking them into your source code repo. This is not a default permission of Azure DevOps, so we need to configure the appropriate permissions for the pipeline to function properly.
These build pipelines will be importing and exporting solutions from environments that could potentially span tenants. Service connections are where the credentials for this access are stored. This section will walk through creating the service connection.
https://admin.powerplatform.microsoft.com/environments
The first pipeline you will create will export your solution from your development environment as an unmanaged solution, unpack it and check it into source control (your repo)
NOTE: If you see the following errors, you may have missed the step of adding the Power Platform Build Tools BETA above.
Congratulations you have successfully exported your solution from a Development environment to an Azure DevOps repo!
As when we manually imported the solution, it comes in without the state data needed for the application. In this set of steps we are going to automate the import of the application data.
If you didn't manually import the data above, you will need to download the data.zip file; you can find it here:
For the file name call it:
ConfigurationMigrationData/data.xml
Note: The dialog should have automatically added the "DeviceOrder" folder name to the file name.
Deployment Settings
To add this file select the vertical ellipsis to the right of the SolutionPackage folder, select upload file and upload this file:
https://aka.ms/deploymentSettings.json
As above, we will be creating the pipeline that deploys to the Test environment from an existing file.
ADD PIPELINE RENAME
Setting up the variables for the Test Deployment Pipeline
The deploy to test pipeline definition expects a username, URL and password for the environment being deployed into. This section will set the values for those variables.
As the Test Deployment pipeline will be deploying into another environment, and potentially another tenant, a new service connection is needed to maintain the connection credentials.
ADD PIPELINE RENAME Step
A common development scenario is to stub calls to production services during the development process so as not to flood those services with test calls; this technique is often called "fakes". In the Device Order application we don't want to send the support team non-production support requests, so we made the support team's email address an environment variable. In this solution we have two artifacts we want to dynamically update: the connection reference the emails are sent with, and the email address itself (you can see this variable defined as new_SupportEmailAddress). This section will show how build pipelines can specify these settings as the solution progresses from Dev to Test to Production… without changing the underlying application.
This screenshot shows the environment variable in the solution
This screenshot shows the environment variable properties in the solution.
In our project these settings are held in the deploymentSettings.JSON file.
Contents of deploymentSettings.JSON
To make the properties dynamic we need to tokenize the deployment settings file and add a Replace Tokens task to our pipeline.
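As an illustration, a tokenized deploymentSettings.json might look like the sketch below. The #{ }# delimiters are the Replace Tokens extension's defaults, and the connection logical name shown is a placeholder rather than the one from this solution:
{
  "EnvironmentVariables": [
    { "SchemaName": "new_SupportEmailAddress", "Value": "#{SupportEmail}#" }
  ],
  "ConnectionReferences": [
    { "LogicalName": "<your connection reference logical name>", "ConnectionId": "#{ConnectionId}#", "ConnectorId": "/providers/Microsoft.PowerApps/apis/shared_office365" }
  ]
}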
The next step would be to set up the Replace Tokens task by editing the build pipeline. (This has been done for you.)
Then set the properties of the Replace Tokens task so we are looking in the solution folder and only at the deploymentSettings.JSON file. (This has been done for you.)
We then need to create variables to supply the values for ConnectionId and new_SupportEmailAddress.
Note: If your build gives the error:
“Could not load file or assembly ‘System.Management.Automation.resources”
This is a generic error the beta extension raises for ALL failures. To see what is causing the build error you will need to enable diagnostics and investigate the real cause. Common causes are missing variables and attempting to deploy a managed solution into an environment where an unmanaged version of that solution is already imported. For instance, in the diagnostics you can see the output:
##[debug]Message: Solution manifest import: FAILURE: The solution is already installed on this system as an unmanaged solution and the package supplied is attempting to install it in managed mode. Import can only update solutions when the modes match. Uninstall the current solution and try again.Detail:
For final confirmation, log into your production system and see your application!
<Greg Hurlman to add PCF section>
© 2019 Microsoft Corporation. All rights reserved.
Information in this document, including URL and other Internet Web site references, is subject to change without notice. Unless otherwise noted, the example companies, organizations, products, domain names, e-mail addresses, logos, people, places, and events depicted herein are fictitious, and no association with any real company, organization, product, domain name, e-mail address, logo, person, place or event is intended or should be inferred. Complying with all applicable copyright laws is the responsibility of the user. Without limiting the rights under copyright, no part of this document may be reproduced, stored in or introduced into a retrieval system, or transmitted in any form or by any means (electronic, mechanical, photocopying, recording, or otherwise), or for any purpose, without the express written permission of Microsoft Corporation.
Microsoft may have patents, patent applications, trademarks, copyrights, or other intellectual property rights covering subject matter in this document. Except as expressly provided in any written license agreement from Microsoft, the furnishing of this document does not give you any license to these patents, trademarks, copyrights, or other intellectual property.
The names of manufacturers, products, or URLs are provided for informational purposes only and Microsoft makes no representations and warranties, either expressed, implied, or statutory, regarding these manufacturers or the use of the products with any Microsoft technologies. The inclusion of a manufacturer or product does not imply endorsement of Microsoft of the manufacturer or product. Links may be provided to third party sites. Such sites are not under the control of Microsoft and Microsoft is not responsible for the contents of any linked site or any link contained in a linked site, or any changes or updates to such sites. Microsoft is not responsible for webcasting or any other form of transmission received from any linked site. Microsoft is providing these links to you only as a convenience, and the inclusion of any link does not imply endorsement of Microsoft of the site or the products contained therein.
Microsoft and the trademarks listed at https://www.microsoft.com/enus/legal/intellectualproperty/Trademarks/Usage/General.aspx are trademarks of the Microsoft group of companies. All other trademarks are property of their respective owners.
<DRAFT!!!!!!!!!>
The suggested best practice for backing up and recovering, or for ALM deployments of, Power Apps is to use solutions.
Unfortunately, exporting and importing a solution doesn't bring the application data along with the entity (table) definitions.
I have been playing with the ALM workflows and wanted to share how you can automate data retrieval when importing your solutions.
The first step is to use Power Apps solutions for your application.
The next step is to extract the schema and data that your application relies on to run. (If you don't have a solution that uses data, skip this step.)
While you can also automate this (see the PowerShell automation process below), I did it manually with the DataMigrationUtility tool that comes with the developer tools:
Download tools from NuGet (Developer Guide for Dynamics 365 Customer Engagement) | Microsoft Docs
The next step is to import your solution into a new environment. If you don't have a solution to play with, I have created a solution for the App in a Day training that contains tables for the Devices and Manufacturers.
You can find that here: <link>
But, like I mentioned above, the application is imported without the data it needs to work. To populate those entities (tables) we could do it manually using the DataMigrationUtility tool, like I did to export the data, but since we will likely want to automate this, let's use PowerShell:
# Prompt for credentials for the target environment
$cred = Get-Credential
# Connect to the environment the solution was imported into
$crmConn = Get-CrmConnection -OrganizationName ContosoEnvironment -OnLineType Office365 -Credential $cred
# Import the data file exported earlier
Import-CrmDataFile -CrmConnection $crmConn -Datafile "Data.zip" -Verbose
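To make this repeatable across environments, here is a minimal sketch wrapping the import in a parameterized script; the environment name and file path are placeholders, and it assumes the Microsoft.Xrm.Tooling.PackageDeployment.Powershell module from the lab above is installed:
param(
    [string]$OrgName  = "ContosoEnvironment",  # placeholder environment name
    [string]$DataFile = "Data.zip"             # placeholder data package path
)
$cred = Get-Credential
$conn = Get-CrmConnection -OrganizationName $OrgName -OnLineType Office365 -Credential $cred
if (-not $conn) { throw "Could not connect to $OrgName" }
Import-CrmDataFile -CrmConnection $conn -Datafile $DataFile -Verbose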
In addition to the over 500 community samples, a lot of people don't realize the product comes with a plethora (in case you ever wondered how many a "plethora" is: in this case it is 34 <g>) of amazing samples and templates.
https://powerusers.microsoft.com/t5/Community-App-Samples/bd-p/AppFeedbackGallery
To use the built-in samples and templates, log into PowerApps.com.
Then from the home screen select “All Templates”.
Note: This is the same as selecting "Create" on the left-most navigation.
Scroll about 1/3 of the way down and you will find the built-in application templates.
If you are looking for a suggestion/starting point the training template is hard to beat!
Note: this does not include the Project Oakdale templates!
Those are even easier to locate and use, as they are called out in the documentation here:
https://docs.microsoft.com/en-us/powerapps/teams/use-sample-apps-from-teams-store
Remember, if you need training on Power Apps you can find that here: https://aka.ms/powerappstraining