Ed G.

D365 Reports - Parameters and Filters and Pain

If you'll recall from last week, I was trying to find a way to automatically send daily HTML reports via email from within Dynamics 365 (Sales Professional). I'm going off the rails a bit and documenting the entire journey, including things that didn't work (for me). Even though this approach didn't end up solving my particular issue, I wanted to highlight it because it was documented so well and packaged so thoughtfully...and it might solve someone else's.

 

Go On, Git!

Like many other Business Users, I've had a tangential relationship with GitHub: downloading some things, reading some docs, maybe even getting a step or two into their tutorials. Most of it doesn't make sense to me...yet, but it's on my list of things I'd very much like to become more fluent in. The first and most promising lead on my report-sending journey was found on GitHub, and although it hasn't been updated or maintained in a while, it got me pretty close to a solution while opening up some new possibilities for later.

Sidenote: They say the best way to learn something is to teach it. As I'm documenting this, I'm finding things I missed in my initial pass that may have led to this solution working for me. I'll try to blend the two experiences as we go, so hopefully we'll end up with another viable solution.

Bob Guidinger has an XRM Report Scheduler that uses Power Automate with a custom connector to render the reports; from there, you can do whatever you want with the output. Don't let the custom connector part scare you off; everything is very well documented, and I'll point out some things that may have changed over the last couple of years. I'll also cover some things that I missed because I wasn't familiar with the terminology/processes.

 

When All Else Fails, Read the Instructions


Most of the time, when 'reading' walk-throughs, I'll skip to the screenshot portion and start following along. The subtle placement of the solution files at the top of the installation guide is breaking me of that habit (but not before I went to a team of experts to ask for help...they were so polite when pointing out that everything I needed was in the very first step). More on that later.

The first real deviation from the installation guide comes when we define the custom connector inside Power Automate. It doesn't appear under the settings wheel (for me, anyway), but if I click Data along the left rail, then Custom Connectors, the New Custom Connector drop-down appears at the top of the window.


Something that may be obvious to others, but wasn't to me: the OpenAPI file this step asks for is actually the Report-Renderer.swagger.json file from the downloads in Step 1. This was the thing I was looking for when I reached out to other MVPs...but I didn't make the connection between OpenAPI and "swagger". Now I know, and so do you.

 

Using Report Renderer


I really liked this solution for how well it was documented, as well as for its ability to be called from within Power Automate. Next, we'll walk through how I set up my flow, what I was able to get working, and what I still can't figure out.



 

Building the Thing


Whenever I build a new flow, I typically start with a manual trigger so I can more easily control when things start, and I'll immediately follow it with a Compose that I rename "Release Notes". This gives me a place to track the changes I make as I roll them out. Then comes the rest of my 'declarations', or the setup I'll use later in the process. The Compose action with the days of the week (shown below) feeds a function I'll use in the email subject.
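
For reference, the contents of that Compose (I named it DaysofWeek, to match the expression we'll see later) are presumably just a comma-separated list; it has to start with Sunday to line up with the dayOfWeek() function we'll use near the end:

Sunday,Monday,Tuesday,Wednesday,Thursday,Friday,Saturday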

The Compose labeled Parameters is the section that continues to trip me up. The documentation states that "out of the box" reports will use FetchXML, while custom reports will depend on how the author labeled the parameters. It also defines "custom reports" as "reports [that are] created through Visual Studio/BIDS", which mine wasn't.


Below, I've detailed how I got the XML from my report, and then what I did to convert it to JSON to match the docs.


If there's already a view in Dynamics 365 that you want to use, then you're all set. I needed some grouping and sorting added, so I built a custom view to gather, prepare, and present the data in a usable format. Next, you'll want to download and install XrmToolBox. This isn't just for XRM developers or super-users anymore; there are some pretty cool things that even Makers can use. Follow the instructions to connect to your organization, and then look for FetchXML Builder from Jonas Rapp. Using this tool, we'll be able to get the XML to use in the report.


Also, if your report doesn't already exist, you'll want to create it in D365 at this point. We won't need it for the XML, but the Report Scheduler will need it in the final steps.


The epiphany I had while documenting this process: I was trying to 'fake' parameters by cramming the filtered field elements into some kind of cobbled-together JSON, when what I should have done was follow the example below and put the entire FetchXML into the JSON value. Don't freak out if this looks scary; I'll explain in a little more detail (as much as I can...I'm still learning, too).

{
  "CRM_FilteredSystemUser": "<fetch version=\"1.0\" output-format=\"xml-platform\" mapping=\"logical\" distinct=\"false\"><entity name=\"systemuser\"><all-attributes /></entity></fetch>"
}

The JSON above is the sample from the Report Scheduler documentation, and while you won't need to be fluent in JSON to work inside Power Automate (and Power Apps), it's very helpful to understand the basic structure. For our purposes, I'll stick with this small example:

{
  "SomeRandomKey": "SomeRespectiveValue"
}

In the documentation, the author states that "The parameter names generally follow a format of CRM_Filtered{Entity}", where {Entity} is the "Logical Name" of the entity the report is based on. I knew from the XML (below) that the entity I was working with was "activitypointer", so I thought I'd try some old-fashioned brute-force find and replace.

Using the 'recipe' from the documentation, I figured my JSON would start with "CRM_Filteredactivitypointer" after the opening brace, then a colon, and then my complete XML inside quotes, followed by the closing brace.
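
To make that concrete, the skeleton looks like this (the placeholder stands in for the full, escaped FetchXML we'll build next):

{
  "CRM_Filteredactivitypointer": "...escaped FetchXML goes here..."
}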


Because the XML also uses quotes, a JSON parser can't tell whether a given quote is part of my XML value or the end of my JSON value, so the document author left this clue:

Note that the FetchXML has the quotes escaped.

This would probably make great sense to a developer, but I had to stare at the patterns to work out that every quote we want the JSON to ignore gets a backslash ( \ ) in front of it. There are tricks around this, but I used something other than Word to do the Find & Replace, since Word likes to change quotes to the fancy slanted 'smart' ones, and JSON definitely wouldn't like that.
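
As a quick before-and-after, using one attribute from my XML:

Before: <entity name="activitypointer">
After:  <entity name=\"activitypointer\">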



Special Note from Jonas: Regarding escaping double quotes - I think this option in FetchXMLBuilder to generate FetchXML with single quotes would do it for you instead of find/replace :)
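
In other words, if the FetchXML is generated with single quotes around its attribute values, there's nothing for the JSON to trip over, and no find-and-replace is needed. A trimmed-down sketch of what that might look like:

{
  "CRM_Filteredactivitypointer": "<fetch version='1.0' output-format='xml-platform' mapping='logical'><entity name='activitypointer'><attribute name='subject' /></entity></fetch>"
}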



So now I've got my entire XML (with the quotes 'escaped') and the key I want to try, so I'll put actual quotes around my edited XML and braces around everything, ending up with this:

{
  "CRM_Filteredactivitypointer": "<fetch version=\"1.0\" output-format=\"xml-platform\" mapping=\"logical\" distinct=\"false\" >
  <entity name=\"activitypointer\" >
    <attribute name=\"activitytypecode\" />
    <attribute name=\"subject\" />
    <attribute name=\"statecode\" />
    <attribute name=\"activityid\" />
    <attribute name=\"instancetypecode\" />
    <attribute name=\"community\" />
    <attribute name=\"createdon\" />
    <attribute name=\"ownerid\" />
    <attribute name=\"description\" />
    <attribute name=\"regardingobjectidyominame\" />
    <attribute name=\"regardingobjectidname\" />
    <attribute name=\"regardingobjecttypecode\" />
    <attribute name=\"regardingobjectid\" />
    <order attribute=\"ownerid\" />
    <filter type=\"and\" >
      <condition attribute=\"createdon\" operator=\"today\" />
    </filter>
    <order attribute=\"createdon\" />
    <link-entity name=\"lead\" from=\"leadid\" to=\"regardingobjectid\" visible=\"false\" link-type=\"outer\" alias=\"a_604e26663guidmask17000d3a569919\" >
      <attribute name=\"jobtitle\" />
      <attribute name=\"companyname\" />
    </link-entity>
  </entity>
</fetch>"
}

I put this whole thing inside my Compose action labeled "Parameters", and then continued building the flow. (One caveat worth flagging: strictly speaking, JSON doesn't allow literal line breaks inside a string value, so the FetchXML may also need to be collapsed onto a single line, like the sample from the docs, before the whole thing parses as valid JSON.)

Using the Custom Connector we created earlier, we select our desired report and format, and, hypothetically, use the output from our Parameters Compose to produce the desired report. Spoiler alert: I still haven't gotten this part to work; more on that later.


Starting from the bottom of the Send an email step, we add the output of the Render Report step as the Attachment Content, a filename (include the file extension), and then a Body, Subject, and (most important) the "To" address.


The function for the filename just takes the current date and formats it as yyyyMMdd.
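
Something like this would do it; the 'DailyActivities' prefix and the .pdf extension are just placeholders for whatever your report uses (note the capital MM for month, since lowercase mm means minutes in these format strings):

concat('DailyActivities_', formatDateTime(utcNow(), 'yyyyMMdd'), '.pdf')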


And, remember that weekday Compose we had up at the top?


We use split() to turn that into an array, and then choose the appropriate day of the week based on the number returned from the dayOfWeek() function, which gives 0 for Sunday through 6 for Saturday (that's why the list starts with Sunday).

split(outputs('DaysofWeek'),',')[dayOfWeek(utcNow())]

For more details on how to do this sort of thing, check out my video below.


 

Persistence and Pain


Short version: I haven't been able to get this solution to work yet. I feel like I'm close, but I don't understand JSON and/or XML and/or Bob's custom connector well enough to work at it methodically. So I keep trying different angles...or maybe someone reading this knows what's going on.


The longer version: with the parameters in place, I'll either get an invalid-JSON error from Power Automate or an invalid-parameters error from the connector. If I take the parameters off entirely, I get the report I want, but with NO filters...so ALL THE DATA. Less than ideal.


Now that I've delivered a working solution (I'll cover that one in the next article), I can take my time working on this approach. I feel that, ultimately, it's the more elegant solution, but in the field, the most elegant solution is a working solution.


If you haven't already caught my podcast interview with Mark Smith, check it out by clicking on the image below. I had a great time chatting about my journey to MVP and how I aim to make the Power Platform accessible to all levels of users.


Until next time, get in, make mistakes, learn some things, and have fun!

