Making a Tweet Bot With Microsoft Flow


If you subscribe to my Twitter feed, you will have noticed a lot more activity of late. This is because I have created a Tweet Bot to find me the most interesting Dynamics articles out there and Tweet them.

My inspiration for doing this was Mark Smith’s Twitter feed (@nz365guy). Every hour Mark pumps out a Tweet, sometimes in a different language, sometimes on related technologies, such as SQL Server. He also drops in quotes from the books he is reading, as well as the odd manual Tweet.

Mark Smith Twitter

As you can see, this formula has been very successful for him. Over 11,000 followers and almost 69,000 likes on the back of 29,000 Tweets. That’s a little over two likes per Tweet. Good stuff.

Previously I had only really used Twitter to promote my blog articles, so I thought it would be a perfect testbed to see if automated Tweeting, plus the odd promotion of my blogs and speaking engagements, did anything to lift my own statistics.

In doing so I also found that a curated list of Tweets was far more useful than browsing through the feed of the people I am Following, not least because my own list of Tweets is ad-free. Now I review the curated list and, most days, if I find something I really like, I post it to my LinkedIn feed. So, if you want to see something less automated, feel free to follow me on LinkedIn.

How It Works

image

Here it is. Essentially, the Flow:

  • Triggers with a pre-determined frequency
  • Initializes a bunch of variables and searches for candidate Tweets
  • Loops through the Tweets to find the best one
  • Stores the winning Tweet in a list of sent Tweets and then Tweets it

Let us go through these stages in more detail.

Recurrence

This seems pretty straightforward but there are a couple of things to consider. Firstly, if I did as Mark does and scheduled a Tweet every hour, this would be around 24*30 = 720 Tweets per month, which is close to my quota of 750 on a free plan. Do-able, but it does not leave a lot of wiggle room for other Flows and experiments like my MongoDB integration.

Initially I set it to every two hours but even this ran into trouble, with the following error often appearing:

{
  "status": 429,
  "message": "This operation is rate limited by Twitter. Follow Twitter guidelines as given here: https://dev.twitter.com/rest/public/rate-limits.\r\nclientRequestId: 00776e5e-6e93-4873-bcf5-a1c972ba7d2a\r\nserviceRequestId: 597a00b83806f259127207b0a18797a0",
  "source": "twitter-ase.azconn-ase.p.azurewebsites.net"
}

The link suggested was broken, so I went to the rate limits in the Flow documentation for Twitter. I did not seem to be violating these limits, which was quite confusing. A little browsing revealed that others had also come across this problem and it does appear to be a bug in Flow.

image

A bit of testing suggests that as long as you do not Tweet more often than once every four hours you do not hit this error (unless you are Jukka).

Variables and the Candidate Tweets

Variables are really useful for debugging, as you can see the value assigned to them, but also for managing the information you pass around in your Flow. In my case, I defined the following variables:

  • TweetBody: The body of the Tweet we will be posting.
  • TweetRank: A measure of how good the Tweet is. Initially I wanted to use ‘Likes’ but Flow does not allow you to access the number of Likes a Tweet has, so I had to use another measure in the end.
  • TweetAuthor: Who Tweeted the best Tweet. While Flow does not allow you to Retweet (or put the ‘@’ symbol in any Tweet you post), I wanted to give the original poster as much credit as I could.
  • TweetID: Every Tweet has a unique ID, which is useful to make sure you are not posting the same popular Tweet more than once.
  • TweetMatch: A flag to record that the Tweet being reviewed has failed to make the cut as the ‘best’ Tweet.

The criterion for the candidate Tweets is pretty simple.

image

If the Tweet has the #msdyn365 hashtag, it is worth considering. You will notice my step limits the number of Tweets returned to 100; this is the maximum allowed by Flow, which is a pity.

Loop Decision One: Has the Tweet Been Retweeted?

As mentioned above, it is not possible with Flow to check the number of Likes a Tweet has, so I took inspiration from Google. While much more complex now, the original algorithm for ranking in the Google search engine was based on the number of links to a web site: the more people referenced you, the more likely you were to appear at the top of the search rankings. In my case, I used the number of retweets of the original Tweet being referenced as my measure of popularity. To clarify, this is not the number of retweets of the Tweet that the Flow search found but, if the search found a retweet, the number of retweets of the original Tweet it points to. Going to the original Tweet as my source also removed the possibility of Tweeting two people’s retweets of the same original Tweet, no matter how popular those retweets were.

However, I soon discovered that testing the number of retweets of the source Tweet failed if the Tweet was not a retweet. I tried working around this by capturing null results but, in the end, it was easier just to test up front.

image

You will see that if the condition fails, we set our TweetMatch flag. If there is no retweet, the Tweet is no good.

Loop Decision Two: Will My Tweet Be Too Long?

Next I want to make sure that if I construct a Tweet from this candidate Tweet, it is not too long. Initially I just concatenated and truncated the resultant Tweet, but this partially cut off hashtags and I could see that being a problem if the wrong hashtag was cut the wrong way (#MSDYN365ISAWFULLYGOOD becoming #MSDYN365ISAWFUL, for example).

image

The format of my resultant Tweet is ‘<author> <Tweet body>’ so as long as this is under 280 characters, we are good to go. Again, if this test fails, we set the TweetMatch flag.
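For those who like to see the logic written out, here is a minimal Python sketch of the equivalent length check (the real implementation is just a Flow condition; the function name is mine, while the 280-character limit and ‘<author> <Tweet body>’ format are as described above):

def tweet_fits(author, body, limit=280):
    # The Tweet we would post is '<author> <Tweet body>'
    candidate = author + " " + body
    # Reject anything over the limit rather than truncate and risk cutting a hashtag
    return len(candidate) <= limit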

Loop Decision Three: Testing for Popularity and Filtering Out ‘Bad’ Tweets

image

Next we ask if the Original Tweet Retweet Count is bigger than the retweet count of our existing ‘best’ Tweet. If not, we raise our flag; if it is, we need to make sure that the Tweet in question has not been Tweeted by me before and that it is not from my blacklist of Twitter accounts.

To manage the list of posted Tweets and the blacklist, I used an Excel sheet in OneDrive. I also included myself on the blacklist as, if I did not, it could lead to the situation where I am reposting my own Tweet, which could itself be reposted, and so on. Again, if these tests fail, the flag is set.

Final Loop Decision: Is the Tweet Worthy?

image

If the Tweet gets through all those checks unscathed, the variables are set with the values from this new Tweet. Otherwise, we reset the TweetMatch flag in readiness for the next loop iteration. We then repeat for the next candidate Tweet until we have gone through all of them.
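Pulling the four decisions together, the selection loop amounts to something like the Python sketch below. The real thing is a Flow ‘Apply to each’ with conditions and variables; the dictionary keys and helper names here (including the tweet_fits check from earlier) are illustrative only:

def pick_best_tweet(candidates, sent_ids, blacklist):
    best = None        # will hold TweetBody, TweetAuthor and TweetID of the winner
    best_rank = 0      # TweetRank: retweet count of the original Tweet
    for tweet in candidates:
        original = tweet.get("retweeted_status")       # Decision One: must be a retweet
        if original is None:
            continue                                   # TweetMatch flag set, move on
        if not tweet_fits(original["author"], original["body"]):
            continue                                   # Decision Two: resulting Tweet too long
        rank = original["retweet_count"]
        if rank <= best_rank:
            continue                                   # Decision Three: not more popular than current best
        if original["id"] in sent_ids or original["author"] in blacklist:
            continue                                   # Decision Three: already posted, or blacklisted author
        best, best_rank = original, rank               # Final decision: this is the new 'best' Tweet
    return best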

Store and Send

image

With the winning Tweet selected, we store its ID in our Excel sheet to avoid sending it twice on subsequent runs and post our Tweet. Initially, rather than using an Excel sheet, I tried string matching to avoid resends but this proved too hard with the limited tools available in Flow. Keeping a list of IDs and looping through them proved to be a lot easier to implement in the end.

As mentioned before, Flow does not allow for retweeting, so I simply constructed a Tweet which looks similar to a retweet and off it goes.

image

Consequences of Activating the Bot

I did have one Follower complain about the bot but, otherwise, things have been positive, as you can see below.

image

Impressions, visits, and mentions are significantly up, with followers also showing a net gain. Moreover, as well as getting more exposure, I now have an ad-free list of interesting articles to read and promote on LinkedIn.

Conclusions

This has been a really interesting project, both from a Flow development perspective and in forcing me to consider what I use Twitter (and LinkedIn) for and whether I should change how I use them.

Building the bot has given me lots of tips on how non-coding developers can think like their coding counterparts, which I will be talking about in Melbourne at Focus 18, and this conscious change in my use of Twitter has massively increased my audience reach.

I encourage all of you to think about how Flow can solve that automation problem you have and also, if you use social media, to seriously consider whether you use it as effectively as you can and whether it could serve you better.


Setting Alerts For NightScout/MongoDB Using Zapier and Microsoft Flow


An Introduction to NightScout

As a Type 1 Diabetic, I need to monitor my blood sugar pretty much 24/7. These days I do it with a Continuous Glucose Monitor (CGM) which sits on my arm and transmits my sugar levels, every five minutes, to my phone.

Here is the sensor and transmitter on my arm; a modified Dexcom G5.

IMG_20181002_104340[1]

and here is the output using a program called xDrip+ (an open source app for Android phones).

Screenshot_20181006-011002[1]

As mentioned in my article on looping, there is a thriving online community building better ways to manage diabetes, both for people like me with the disease and, as Type 1 Diabetes often affects children, for carers managing it on someone else’s behalf. One of these innovations, and a key piece of technology within the looping community, is NightScout: an open source web page you can build to pull your data up onto the internet. Here is the same data stream on my NightScout web page.

image

The database technology behind NightScout is called MongoDB. MongoDB is big in the open source community but not so much in the Microsoft world. In this article I will walk through how to connect to this underlying MongoDB database using Zapier and Microsoft Flow so you can set up things like Power BI reports, alerts when your blood glucose is out of range, or even a stream of data being emailed or tweeted to someone who wants it.

image

While NightScout can be set up on Azure, I had some real problems getting it to work so I went to the other option: Heroku. The irony that I am using a Salesforce subsidiary to house my data is not lost on me. As most people set up their NightScout on Heroku, this will form the basis for my set up instructions, but the principles I am showing will work just as well on an Azure-hosted NightScout site.

The Easy but Expensive Way: Zapier

Zapier is by far the easiest way to connect to the MongoDB data. It literally does all the work for you. Firstly, we need to sign up for a free account with Zapier. Once this is done we will want to “Make a Zap!”.

image

For the trigger, we want MongoDB and we want it to trigger when a new Document is added.

image

Basically this means the Zap! will fire every time a new entry is transmitted from my CGM, received by xDrip+ and uploaded to NightScout’s MongoDB. Next we will need to set up the connection to our MongoDB database.

image

Fortunately, the NightScout settings have everything we need. If we go to our Heroku account, select the NightScout app, select the Settings tab and ‘Reveal Config Vars’, the one we want is ‘MONGODB_URI’. This will be in the format: mongodb://<Username>:<Password>@<Host>:<Port>/<Database>.

Transfer these values across and it should just work. Next we set up the options for the MongoDB database. The collection we want is ‘entries’.

image
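As an aside, the same MONGODB_URI works with any MongoDB client, so if you want to check the connection details outside of Zapier, a minimal Python sketch (assuming the pymongo package is installed, and that the ‘date’ field holds the reading time as NightScout stores it) looks like this:

from pymongo import MongoClient

# Paste the MONGODB_URI value from Heroku's Config Vars here
uri = "mongodb://<Username>:<Password>@<Host>:<Port>/<Database>"

client = MongoClient(uri)
db = client.get_default_database()                     # the <Database> named in the URI
latest = db["entries"].find_one(sort=[("date", -1)])   # newest CGM entry
print(latest["sgv"])                                   # blood glucose in mg/dl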

Next it will want a sample to understand the format of the stored data. Go ahead and let it pull a record. Once this is completed, our trigger is finished. Next we specify what happens when the trigger occurs, i.e. what we want to happen when a new reading hits MongoDB.

There is a wealth of Actions to choose from but, for simplicity, I will choose “GMail – Send Email”. Again, the process is pretty simple and mirrors the setup of the trigger. The only trick to mention is clicking the icon to the right of the field if you want to reference data from the trigger. In the case of the MongoDB data, the blood glucose level is called ‘sgv’ and is stored in the US mg/dl and not mmol/l.

image
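One small aside for those of us who think in mmol/l: the conversion from the US units is just a division by roughly 18, e.g.:

def mgdl_to_mmol(sgv):
    # sgv arrives in mg/dl; divide by ~18 to get mmol/l
    return round(sgv / 18.0, 1)

print(mgdl_to_mmol(77))   # 4.3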

Our final steps will be to name our Zap! and activate it.

Once done, the Zap! will query MongoDB every 15 minutes and bring back the new values and send an email for each one.

image

So far everything I have described is free. xDrip+ and NightScout are free and MongoDB and Zapier also have free accounts. So why is this the expensive option? The reason is, as soon as we want to make our Zap! a little more sophisticated, we need to upgrade our Zapier account. The free account allows you to create two-step Zap!s but, if you want a condition, e.g. only sending an email if the sgv value is greater than or less than a specific value, you need to upgrade to the US$25/month account. Fortunately, there is an alternative.

The Trickier but Cheaper Way: Microsoft Flow

Sadly, Microsoft Flow does not have a Connector for MongoDB but it does have the ‘http’ step which serves the same purpose, using the MongoDB REST API.

For Flow, the first step is to set a trigger. While the Logic Apps version of the ‘http’ step has the ability to set a recurrence for calling the REST API, the Flow version does not, so we need to set a Schedule – Recurrence trigger.

image

Just like the Zap!, we will poll every 15 minutes.

Next we set up our ‘http-http’ step. This is the tricky bit.

We are getting data, so our method is GET. The URI is what we use to call the MongoDB REST API; in our case, to bring back our data. The format of the URI I am using for this example is:

https://api.mlab.com/api/1/databases/<database>/collections/entries?l=1&f={"sgv":1}&s={date:-1}&apiKey=<api-key>

Thank you to Ravi Mukkelli and Olena Grischenko for figuring this part out. Full documentation for the REST API can be found here. To translate the URI: I want it to return the first record only (l=1), showing just the ‘sgv’ field (f={"sgv":1}), with the collection sorted in descending date order (s={date:-1}).
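If you want to see what this call returns before wiring it into Flow, here is a minimal Python sketch of the same request (assuming the requests package; the URI and parameters are exactly those described above, and the 180 mg/dl threshold is the example condition I add to the Flow later):

import requests

url = "https://api.mlab.com/api/1/databases/<database>/collections/entries"
params = {
    "l": 1,                # only the most recent record
    "f": '{"sgv":1}',      # return just the sgv field
    "s": "{date:-1}",      # sorted in descending date order
    "apiKey": "<api-key>",
}

records = requests.get(url, params=params).json()   # e.g. [{"_id": {...}, "sgv": 77}]
sgv = records[0]["sgv"]
if sgv > 180:
    print("High blood glucose:", sgv)                # in the Flow, this is where the email goes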

The API key is available from your MongoDB account. Simply go to your Heroku account, select your NightScout app and click through to mLab MongoDB.

image

Click on the User link.

image

and your API key will be shown on the next page. Also remember to enable Data API Access, which can be done just below where you see the API key.

image

Your Flow should now look something like this.

image

This will get data every 15 minutes in the form of JSON. JSON is a way to represent data in a text format. Think of it as a generic and adaptable alternative to XML, which is itself a cousin of HTML (the thing that web pages are made of).

To make use of the data, we need to parse it (translate it into something useful). To do this we add a new step. Searching for JSON shows the Data Operations – Parse JSON step. The content is the Body from the http step. For Flow to understand the fields, it needs to get a sample of the data. To feed it this data we click the “Use sample payload to generate schema” link.

image

To get this sample, all we need to do is paste the URI from the http step into a browser. You should get something like this:

[ { "_id" : { "$oid" : "5bb8c4cd44df60074eef234e" }, "sgv" : 77 } ]

Your end result should look like this.

image

Finally, we send out the email via Gmail. At some point while setting up your Gmail – Send Email step, Flow will likely insert a loop. This is because, in principle, our query could have returned more than one record. As my query forces the return of only one record, the loop will iterate once and is not really needed. It is inelegant but it will work.

Also, by default the only field that gets shown is the oid one. Therefore, you may need to click on ‘See more’ for the sgv field to show.

image

All up, this is what our basic Flow looks like.

image

The result is similar to what we got with the Zap! one.

image

I say similar because the Zap! was slightly smarter in that it returned all records created in the 15-minute interval since the last run, whereas this one only retrieves the latest record.

The one big advantage Flow has is that we can add more steps. So, for example, if we only want to send an email when the sgv value is over 180, we can add that in, no problem.

image

Also, it is the cheap option because, while there are different plans available for Flow, the base plan is completely free. Please note that on the base plan you will only be able to run the check every 30 minutes due to the monthly limit of 2,000 runs. The next plan up is US$5 per user per month, which may be a better option.

Conclusions

Tools like Microsoft Flow and Zapier offer non-coders a way to address problems in ways previously out of reach. Putting these tools in the hands of people managing diabetes means they could literally save lives. If you are using NightScout, have a play and see how you can use the technology to make your life easier.

Divinity and the ‘Citizen Developer’


It is a term of derision, a key piece of Microsoft marketing and a term I have little time for. It is ‘Citizen Developer’. In the unlikely event you are unfamiliar with this term, it is the persona used by Microsoft to describe the benefits of their Power Platform. While I could not discover who invented the term, it is one which has been strongly embraced by the Microsoft marketing engine.

The reason they have embraced it is obvious. As evidenced in some of my previous posts, it is not hard to use tools like Flow, PowerApps and Power BI to add real value to an organisation with little to no code or scripting. This opens up the way for non-coders to build powerful business applications, previously the exclusive domain of coders. The rise of the Citizen Developer.

To be honest, this is not really a new approach for Microsoft. In the past Microsoft used the term ‘Power User’ and this has always been the focus for Dynamics innovation. Certainly as far back as the product moving from v3 to v4, there was a deliberate move to give power to administrators who wanted to tinker without code. Back then I used to say you could get 70% of the way towards a production-ready application using just configuration. With each major release that percentage has increased and it is very possible to build production-ready systems these days with no appreciable code in them.

In parallel, as workflows matured and the product became more and more flexible, there was a breed of coders who grumbled that feeding the power user was a terrible idea. Limiting the power to create exclusively to coders reminds me of the days when bibles were written in Latin and the only people who could read them were the priests. The priests had exclusive access to the divine and all others had to go through them. Thankfully times have changed.

I have experienced this resistance to the power shift first hand. In the days when my blog featured cute tricks with workflows, I used to hear it a lot. Coders I spoke to would complain that processes should be handled with code; that splitting processes across different technologies meant an administrative nightmare and offered no value in the long run. Also if the power of workflows were given to end users, it would be a disaster with them going rogue and creating an unmanaged mess. We hear the same fear and uncertainty today with Flow and the fear of ‘Shadow IT’; the creation of apps and processes unsanctioned by the administrators.

I agree that things must be managed but the perspective that the solution is to keep power with the priests misses the point. When development was restricted to the few it was easy to administer, now that many can develop it is harder but it is not the wrong approach.

There are two obvious benefits to giving access to creative power to non-coders. Firstly, it frees up coders to do more interesting work. Rather than writing routines to perform dull busy-work they can cast their minds to more interesting and unique problems. Secondly, it gives non-coders a much greater appreciation of what coders do and how they go about it.

The answer to managing this new world is collaboration and discipline. Standards and conventions need to be put in place to ensure Workflows, Plugins and Scripts all work together, rather than against each other. The idea that giving non-coders power is the problem is not true. Undisciplined power is the root of the problem. I know many a Dynamics system that came undone on upgrade because a coder decided it would be easier to use unsupported methods to develop a system. Just as with the priest, the coder is human. In the new world, what separates the good and the bad ones is not exclusivity to power but their approach to their work and their ability to work with others.

Just as the role of priests in society has evolved, so too has the role of the coder in modern software development. A good coder, having experience in managing the development and release of software, has the responsibility to guide others starting out on their journey. Thankfully the way we approach software development has evolved to accommodate this new perspective. Agile has meant a greater focus on release management and DevOps, and variants such as Scrum are clear that there is no ‘i’ in team. The ‘development team’ makes no distinction between those that can code and those that cannot.

The idea of coders and non-coders being equivalent in terms of software development sits much better with me than the divisive idea of ‘coders’ and ‘Citizen Developers’. While a traditional coder may well say Flow is what Citizen Developers use while ‘true’ developers embrace Logic Apps, the fact is both tools have their place in software development and many coders appreciate and gain much benefit from their use of Flow.

The fact is the relevance of coders and priests does not derive from exclusivity and, while many considered the change in exclusivity tantamount to blasphemy, the experience for all is much richer today because of it.

I have faith that, over time, we will move away from terms like ‘Citizen Developer’ and embrace the idea that anyone who builds is simply a developer and the tool used to get there is irrelevant. What is more important than the tool used is the approach used by the entire development team to deliver a robust solution. That vision of the future is something I can see myself believing in.

CRM Crime Files: LinkedIn Marketers


mugshot.linkedin

This post references my time at KPMG. I am no longer at KPMG but I have had this post in draft for a while now and thought it was time to finish it off.

A while ago I did a Crime File on ETSY and their customer service when setting up my store. This time it is LinkedIn or, more accurately, the lazy marketing companies who use it to try and generate leads for others. On two occasions I have received emails like this:

IanDorney_redacted

The KPMG article referenced was not one I had any involvement in and I think it is quite courageous to email someone who works at KPMG and suggest they have “lazy accountants”. It is my opinion that KPMG has some of the hardest working and committed accountants I have ever seen. I was also left with the feeling that, despite their suggestions otherwise, they did make this offer to anyone prepared to listen.

The article claimed to come from the Principal at an accounting company. I have removed their name and the name of their company from the above image but left in place the name of the true CRM criminals: Lead Gladiator.

Lead Gladiator offer a ‘flood’ of leads for a ‘mere’ $2,000/month, using the methods described here.

Knowing the Principal could do better, and having some LinkedIn InMail credits to burn, I messaged him.

IanDorney3_redacted

Sadly the Principal never got back to me but Lead Gladiator did.

IanDorney5_redacted

I preferred the tone of this message but the damage had already been done. You only get one chance to make a first impression and they had failed. I replied.

IanDorney6_redacted

I never heard from them again. The biggest issue for me in this was one of authenticity. Talking about an article I had no involvement with, telling one of the world’s largest professional services companies that they have “lazy accountants”, and then suggesting that the offer being made in a clumsily customized mass marketing piece is somehow exclusive, started the relationship on the wrong foot. The interaction damaged the organisation that paid Lead Gladiator more than it helped them and I doubt they got value for the large amounts of money they spent. Moreover, what does it say about your company if you only care about new customers enough to outsource your relationship with them?

My second experience started in a similar way:

kent_cameron4_redacted

Either a bot had generated the text or someone had run it through Google Translate without sanity checking it with a native speaker. Whichever it was, I only partially understood their intention. Again, LinkedIn InMail came to the rescue. I sent a message to the Founder.

kent_cameron1_redacted

To his credit, the Founder replied.

kent_cameron2_redacted

Good deed done for the day.

Conclusions

While it can be tempting to outsource parts of your business to ‘experts’, be very careful who you partner with. In both of these cases, the business owners’ intentions were good: to grow the business. However, in putting their faith in third parties and not being involved in the process, they damaged their brand and potentially achieved the exact opposite of what they set out to do.

A business is successful when it creates real relationships with its customers and stakeholders and there is no quick way to do this. True customer relationship management is about fostering long term relationships and delivering value. If you cannot be bothered to even engage with a prospect in an authentic way, why would that prospect think you are going to deliver value when they employ you?

What Jumps Out From the October 18 Release


The release notes for the next major release of Dynamics are out and they are over 200 pages long. The document covers what Microsoft are going to bring out for Dynamics from October 2018 up until March 2019.

I will ignore the Finance and Operations/Business Central/Project Service/Field Service/Talent/Retail stuff (because I do not know them well enough to know what is worth getting excited about). I will also skip over anything in Public Preview (as this is often not available in Australia and is, essentially, a beta release) but anything else is fair game. Here are the things which jump out and get me excited in v10.

Dynamics 365 For Marketing

Account-Based Marketing

Microsoft released Dynamics 365 For Marketing quite early in its development. What I mean by this is there is some basic functionality there but there is plenty of room for improvement. One such improvement is Account-Based Marketing. Until now, the mass communication tools of ‘Marketing’ were for Contacts only. No Lead marketing in the Dynamics sense and no Account marketing. This now appears to have changed. This allows Microsoft to claim Marketing is for B2C and B2B scenarios, which is good. Excitement may be too strong a word to describe my feelings but it is a step in the right direction.

Social Listening For Campaigns

image

Being able to add hashtags/phrases to Campaigns and then actively monitor the online response to the Campaign from within Dynamics is great. The Social Engagement section reveals that this is, essentially, embedding Social Engagement into Marketing. The release does not talk about actioning the social responses but, for now, we can add hashtags and see if they light up. We can measure more than the traditional email clicks and opens.

Sales

Build Intelligent Sales Applications and Business Processes Powered by LinkedIn Insight

I thought LinkedIn Sales Navigator already brought Account and Contact information into Dynamics but perhaps this is not the case. The release talks about bringing in:

  • Company data such as size, industry, and location
  • People data such as name, company, position, and years of experience
  • Icebreakers and conversation starters
  • Warm introduction connections
  • Recommendations of similar leads in an organization (presumably to circumvent blockers)

This is where the Dynamics – LinkedIn story begins to take shape. Those using LinkedIn will have already seen enhancements since Microsoft took it over, e.g. email prompts to read up on people you are meeting with. Bringing similar insights into Dynamics is a game changer in terms of efficiency. For the savvy user this information was always available; the integration brings it together and makes all users work more efficiently and effectively.

Increase Sales Conversions with Predictive Lead Scoring (Public Preview)

OK, this is in Public Preview but it is probably the stand-out, most exciting thing for me in the release document. Finally, the app I predicted six years ago is a reality. Well, almost. Predictive Lead Scoring rates Leads in terms of their likelihood of turning into Opportunities, based on their attributes (my post from six years ago applied the idea to closing Opportunities).

This is a big deal and is a glimpse into the future of decision making. Lead scoring will tell you which Leads to focus on, i.e. the ones most likely to convert. Remember Glengarry Glen Ross, where Jack Lemmon laments that the stack of lead cards he is given is useless? In the modern world, Dynamics will reorder the stack to put the best on top.

No one enjoys cold calling. With predictive analytics we can put the ‘hot’ leads with the humans and spare them the cold ones by using AI or automated channels to field interest. Users become more engaged and, again, more efficient and effective (of course in the future, two of the four salesmen from Glengarry Glen Ross will be bots).

Service

Suggest Similar Cases

image

Another great example of providing available information to the user as they need it. You can get halfway to this in v8/9 using Knowledge Articles but it is still up to the user to enter keywords to search for. Using the Microsoft Text Analytics APIs automates this step.

Imagine a user who is new to the job being asked to solve a complex but common problem. They now have the wealth of experience within the organization a click away. Better service, more productive, less frustration and confusion. Everyone wins.

Dynamics 365 Portals

The release says they have overhauled the platform to make it more reliable and ‘performant’. I have experienced some of the scalability issues of the Portal firsthand so I welcome any improvement in this area.

Integration with other Microsoft services

image

In this case it is SharePoint document libraries (great if you are storing your Dynamics attachments in SharePoint) and Power BI. Apparently it will be possible to embed Power BI dashboards and reports via liquid script on the Portal web page. Power BI, in the right hands, is crazy-powerful so having this easily surfaced in Portals is very exciting.

Configuration Migration

Moving Portal configuration between environments is not simple. As configuration is held in records, it amounts to a data migration exercise. We still cannot move data via solutions (please make this happen Microsoft!! Saleslogix had this 15 years ago for goodness sake!!) so a data migration tool is needed.

One such tool is the Configuration Migration SDK. My personal experience with it has been problematic. Specifically, if two Web Form Metadata records reference the same attribute, only one comes across with the tool. Microsoft claim they have created a schema for the Configuration Migration SDK which works. I am looking forward to trying it.

PowerApps

There is a LOT in the PowerApps section. Here are the Dynamics nuggets worthy of mention.

Extend Dynamics 365 Entity Forms with Embedded Canvas Apps

This is probably second in my list of exciting things in the release. We can embed PowerApps Canvas Apps within Dynamics Forms. A big strength of Dynamics Forms is their ease of use but this comes at the price of configurability. We only have so much control on field and section layout, for example. This now changes with the ability to embed Canvas Apps.

Moreover, the Canvas App can link to anything we like via the Connectors. The key question for me is whether we can use code to pass information between the Dynamics Form and the Canvas App. If we can, this will be very powerful. Imagine having a Canvas App for adding metadata to an attachment added to SharePoint via the Dynamics forms? Alternatively, we can have a USD-like query form to use in the context of the record we are on. Perhaps we need to query SAP or an Oracle database as part of a Dynamics process but do not want to go through the trouble of bringing the data into Dynamics. We now have a quick way to make this happen.

Native Support for Common Data Service Data Types in Canvas Apps

Option Sets and GUIDs have always been tricky to manage in PowerApps. Interacting with these took PowerApps from no-code to low-code. This has now been tidied up and the release claims “native (CDS) support for Option Sets (sic) and GUIDs and improving the time zone handling for date/time values”.

Faster Load Times with Parallel Data Loading in Canvas Apps

Tables and entities can now be loaded in parallel, rather than sequentially, which has the potential to speed things up. Given one of the biggest gripes about PowerApps is the speed to load, this cannot be a bad thing.

Set Regarding Lookup Enhancements in Common Data Service

It will now be possible to filter the list of entities you can set the Regarding to. This is a big plus for users who, otherwise, have to wade through a giant list of, mostly irrelevant, entities to get to the handful they desire.

Control Availability of User Experiences on Unified Interface

As many of us who have tinkered with the new Unified Interface (UI) know, there are some things missing. Here are some they are fixing in v10:

  • Advanced Find
  • Merge Records
  • Record Sharing
  • Bulk Edit
  • Run Workflow

Advanced Find, in many cases, was a show-stopper for clients moving to the new client. Fixing these up will go a long way to bringing users to the new Dynamics world.

Microsoft Flow

The release talks about the ‘citizen developer’. Personally, I detest the term ‘citizen developer’ because self-righteous coders use it as a term of derision. Either you develop (through configuration and code) or you do not. If you do, then regardless of the configuration/code mix, you should be subject to the same discipline in regards to maintenance and governance. In my mind there are simply good developers and bad developers. I know some excellent ‘citizen developers’ and some lousy coders but I digress…

Design flows in Visio

image

Flows can now be designed and published from Visio. Very cool from a system documentation perspective.

Custom Controls in Business Process Flows

image

Custom controls in Business Process Flows in Dynamics are now supported for both the Unified Interface and the Web Client (yes, the Web Client too!). This means we get a lot more flexibility in the look of Business Process Flows, making them a lot more useful.

Power BI

Not a lot to report in this section of the release other than Power BI dashboards and reports can now be embedded in Dynamics Forms, as mentioned earlier.

Data Integration

image

If you are not familiar with the new way of doing things in Dynamics, here it is and this is, by far, the biggest change to Dynamics in years. CDS for Apps is the new Dynamics platform. CRM no longer exists and has been broken into Apps (Sales, Marketing, Customer Care etc.) on top of a ‘Core’ Dynamics layer containing just the essentials e.g. Users, Teams, Accounts, Contacts, Activities etc.

image

While a bit of a simplification, Data Integration is the Flow Connectors, linking Dynamics (or, rather, CDS for Apps) to hundreds of web services. To be more accurate, the Connectors are not strictly specific to Flow; this is just where many of us have seen them. A Connector can be used in Flow, PowerApps, Logic Apps (why does this have a space and PowerApps does not!!), and Power Query (the new data querying tool being pushed in this release).

Conclusions

So there you have it. All the juicy bits that keep me excited about the future of Dynamics/PowerApps/CDS for Apps.

What really excites me is that it is clear that Microsoft continues to improve the product but, more importantly, they are being a lot smarter about it than they have in the past. Instead of developing in parallel to other areas of Microsoft, the ecosystem is coming together. There is no longer Dynamics development running parallel to Power BI development and PowerApps development; they are all part of the same platform, working together. This is a clear, unique advantage to the Microsoft offering, compared to competitors who still deliver disjointed point solution offerings. Very exciting times and a very promising future for the product/platform/ecosystem.

The Shell Game of Salesforce Reporting


shell-game-salesforce

Now that I am writing a weekly tip for the CRM Tip of the Day, I have reduced my blog cadence down to one post a month. As Salesforce reports their finances quarterly, this means literally 1/3 of my posts in a year would be about Marc’s experiment in creative accounting.

With things like Flow, PowerApps, and the new unified interface, that is simply too much bandwidth to be devoted to the software Microsoft rejected. So I will do a financial report at the end of the fourth quarter and the lemonade stand cash flow analysis but save the rest for mostly Dynamics content.

It has been a big year for Salesforce from a financials perspective. Through one-off sales and tax write-offs in the past, Salesforce had shown an artificial profit. Now they seem to be generating a real one (the lemonade stand will confirm it but that will have to be another post). The margin is not great compared to the competition and, in my opinion, the stock price is expensive for what you get, but a profit is a good foundation for a business.

What is interesting about this quarter is what has NOT been reported. Salesforce love telling the good news and hiding the bad news. This is why they hold on to Non-GAAP accounting as tightly as they do. A couple of other trends have been working against them and these have mysteriously disappeared from the financials and their web site. Let us go to the numbers.

Salesforce in the News

Salesforce Workers Urge CEO To ‘Re-Examine’ Work With Customs And Border Protection

Benioff has been petitioned from within by hundreds of Salesforce employees to cancel contracts with US Customs and Border Protection over its separation of children from their parents at US borders. This will be a true test of whether the organisation puts cultural values ahead of profit.

How MuleSoft will change the way Salesforce connects its clouds

I must admit I was confused by Salesforce’s acquisition of MuleSoft. Reading this article makes me think this is Salesforce trying to create their own version of Microsoft Flow. Time will tell.

Why Microsoft was so determined to beat Salesforce in the battle to acquire LinkedIn

An interesting take on how LinkedIn fills a gap in the Microsoft sales ecosystem. There are a lot of great technologies out there. The winner will be the one who can bring them together in a way which is intuitive for users.

Numbers of Note

No More Transactions or Staff Numbers

In the last Salesforce review I called out that their transactions were shrinking.

image

I wanted to update this graph but the information is conspicuously absent from the Salesforce Trust site. My guess is the line continues to move down. Show the good news, hide the bad.

Similarly, the financial report no longer reports staff numbers. In that case there was nothing really untoward. Staff growth was around 4% per quarter, which was less than revenue growth but that was about it. Perhaps they do not want to admit they are working their staff harder?

Costs Have Jumped

image

Cost growth has rocketed from 2% to 8% per quarter, rushing past the 6% revenue growth. As we have discussed before, if costs are growing faster than revenues, profitability will decrease.

GAAP vs Non-GAAP

walnuts

Salesforce continues to embrace Non-GAAP reporting to the tune of ten mentions of Non-GAAP to three mentions of GAAP for a difference of seven. So many walnut shells, so few peas.

Buzzword Bingo

The same words again were the only ones mentioned more than ten times:

  • Customers/customer
  • Revenue
  • Cloud
  • Growth
  • Operating
  • Salesforce

Every quarter, for the last four quarters this has been the case. The speech focus never alters and, arguably, the same could be said for the content. I started Buzzword Bingo to get an idea of the focus Salesforce was putting in the quarter; where their strategy lay. The truth is Marc never alters his patter. Every quarter is great and there are no problems or, at least, none he wants to mention.

Predicting the Future

As I am only going to do this review annually, it seems fitting to try and predict the revenue and profit 12 months out. This is much harder but here we go. For the next Q4 report, I predict a revenue of around $3.8b with a 2.5% margin giving a profit of $95m.

Conclusions

In terms of the measures, not a lot changes from quarter to quarter at Salesforce these days so, in terms of interest factor, moving to annual reviews is probably a good thing. It will be interesting to see in a year’s time what data is included and omitted and this, I think, will be where the gems will be.

For this quarter, while there is something suspicious happening with transaction growth, the company appears to be genuinely profitable from a GAAP perspective although costs are accelerating. I will enjoy doing the cash analysis to get a bit more insight into this new, profitable world to see exactly where the money is coming in from: genuine sales or something else.

Triggering Flows With A Dynamics Field Change (And Calling A Flow From A Workflow)


There were many excellent sessions at Dynamics UG EMEA (CRMUG EMEA) in Dublin and, as I recently mentioned in my Tip of the Day, if there is a Dynamics Saturday or Dynamics UG event near you, they are an excellent investment of your time (and not necessarily a large investment in cash).

One question that came up during one of the sessions was how to trigger a Flow on the change of a field. One suggestion was ‘WebHooks’. When I talked to the developers in my team, they agreed this could work but it struck them as a fair amount of effort just to trigger a Flow. So I wondered if we could put something together with configuration.

Here is the idea I came up with.

What Can We Do Without Thinking Too Hard?

For those that have not tinkered too much with Flows, the Dynamics triggers available are all at the record level. We have:

  • OnCreate (when a record of a specific entity is created)
  • OnDelete (when a record of a specific entity is deleted)
  • OnUpdate (when a record of a specific entity is updated)

Clearly, if I am waiting for a field to change, I have the OnUpdate trigger, but this fires on ANY field change, which is inelegant and also potentially expensive when we pay for Flow by the run.

My Solution

Back in my Workflow Scheduler blog article, the Flow created a record in Dynamics which triggered the Workflow. This time, we are reversing that. The order of events is:

  1. A field is changed on a record (say the Est. Close Date of an Opportunity)
  2. A workflow is triggered to create a custom entity child record
  3. Flow monitors for the OnCreate of such records and then fires.

There it is. That simple. So what does it look like?

The Custom Entity

image

The custom entity has a name (which we could use as a variable for triggering different Flows) and a link back to the original record where the field change happened.

The Workflow

image

In this case, the Workflow triggers off the change of the Est. Close Date and creates a Flow Field Trigger record.

image

We could also use the Workflow to pass values from the Opportunity down to the Flow Field Trigger for posterity but, for the purposes of the blog, I will keep things simple.
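For the curious, the record the Workflow creates is nothing special. If you were to create the same record through the Dynamics Web API it would look roughly like the Python sketch below; the entity set and field schema names are hypothetical (use whatever you called your custom entity and its lookup), and in the actual solution there is no code at all because the Workflow’s Create Record step does this for us:

import requests

# Hypothetical schema names for the custom entity and its lookup to the Opportunity
api = "https://<org>.crm.dynamics.com/api/data/v9.0"
record = {
    "new_name": "Est. Close Date changed",
    "new_OpportunityId@odata.bind": "/opportunities(<opportunity-guid>)",
}

requests.post(
    api + "/new_flowfieldtriggers",
    json=record,
    headers={"Authorization": "Bearer <token>"},
)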

The Flow

image

I have only created the trigger here as the steps could be anything you like. With the lookup to the originating Opportunity, we have access to all information as if we triggered the Flow directly. As mentioned, we can also bring values from the Opportunity down to the Flow Field Trigger record, with our Workflow, to simplify things.

Bonus Result: How To Trigger A Flow From A Workflow

The more astute of you (and those that read the title) will realise we also now have a method for triggering a Flow from a Workflow (no, v9 does not have a Workflow Step for this). Just as we triggered a scheduled Workflow by creating a record via Flow previously, here we are triggering a Flow by creating a record via a Workflow.

Conclusions

For entities where record creation or update rates are high, triggering a Flow off of the record creation/update may be problematic. While this option requires a little configuration, it gets around this problem and, as with the scheduled Workflow solution, also provides a record to track related Flow actions within Dynamics (with the option of storing the values at execution as well).