Have you ever had to read an Excel file as soon as it was uploaded to Dataverse? Did you know that a flow can be triggered when an Excel file is updated? Do you have a complex template provided by the customer that you would like to import recurrently into multiple tables in the CRM?
The article below details how this can be done through Power Automate 👇
Case Scenario :
My client is using Dynamics 365 for Marketing. They would like a simple way to import Event Registrations and Check-ins through a single Excel file uploaded to the Event table.
Trigger : Dataverse - When a row is added, modified or deleted :
Select the update of the Dataverse file as the trigger of your flow as shown in the image below, then choose the "Download a file or an image" action in order to 'read' the file content and send it to SharePoint correctly.
As you can see, the "Change type" = "Modified" and the "Column filter" = the technical name of my file field.
The "Download a file or an image" action needs to be filled with the main table name, the ID of the row whose file update triggered the flow, and the file column name.
Action : Create file in SharePoint in order to read it
The next step is to create a new file in a SharePoint site in order to read the Excel content successfully. (Note that we will delete the file at the end of the flow run.)
As you can see below, I am generating a file in SharePoint named after the Event and appending the ".xlsx" extension at the end of the file name so that it renders successfully as an Excel file.
Action : Send an HTTP request to SharePoint
Here comes the tricky part: sending a correctly formed request to SharePoint with the right "Uri" in order to read the file we just created.
Choose the site address, then the method and headers exactly as shown in the previous image.
To use the Excel Online connectors with a dynamic file name, we need to find the "doc library id" of that file. The final Uri should look like:
_api/v2.0/drives/[Doc library id]/root:/[FILENAME.xlsx]
Step 1:
- Enter _api/v2.0/drives as the Uri and run the flow.
- Select the "Show raw outputs" button in order to check the result of your flow run
- Copy the "DOC LIBRARY ID" from the body value as shown below; it will complete the final Uri.
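To make the shape of Step 1 concrete, here is a minimal Python sketch of extracting the doc library id from the raw outputs of the _api/v2.0/drives call. The response body below is a hypothetical sample (real responses contain many more properties), and the library name "Documents" is an assumption:

```python
import json

# Hypothetical sample of the JSON body returned by "_api/v2.0/drives".
# Real responses include more properties per drive.
raw_output = json.loads("""
{
  "value": [
    {"id": "b!x1y2z3-sample-doc-library-id", "name": "Documents"},
    {"id": "b!a4b5c6-another-drive-id", "name": "Site Assets"}
  ]
}
""")

# Pick the document library that holds the uploaded file
# (matched here by its display name, assumed to be "Documents").
doc_library_id = next(
    d["id"] for d in raw_output["value"] if d["name"] == "Documents"
)
print(doc_library_id)  # b!x1y2z3-sample-doc-library-id
```

In the flow itself you copy the id by hand from "Show raw outputs"; the snippet only illustrates where it sits in the response body.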
Step 2:
- Edit your flow again, complete the Uri as follows, then save it:
_api/v2.0/drives/[Doc library id]/root:/@{outputs('Create_file')?['body/Name']}
NB: The final part of the URL is the name of the file created in SharePoint.
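As a quick sanity check, the final Uri is just string concatenation of the doc library id and the dynamic file name (which the flow supplies via outputs('Create_file')?['body/Name']). A sketch with hypothetical values:

```python
# Hypothetical values: the doc library id comes from Step 1, and the
# file name is what the "Create file" action produced.
doc_library_id = "b!x1y2z3-sample-doc-library-id"
file_name = "Annual Gala.xlsx"

# Same shape as _api/v2.0/drives/[Doc library id]/root:/[FILENAME.xlsx]
uri = f"_api/v2.0/drives/{doc_library_id}/root:/{file_name}"
print(uri)  # _api/v2.0/drives/b!x1y2z3-sample-doc-library-id/root:/Annual Gala.xlsx
```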
Actions : Compose - Get Tables - List rows present in a table
Add a "Compose" action with the following input:
@{body('Send_an_HTTP_request_to_SharePoint')?['id']}
Then add a "Get Tables" and a "List rows present in a table" action (renamed to "List checkins" in the image below) in order to get the rows of the Excel file into your flow.
The output of the "Compose" action is used in the "File" field of both following actions.
As you can see, we run "Get tables" in order to use the first (and possibly only) captured table in the next action, through the formula: first(outputs('Get_tables')?['body/value'])['id']
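The expression first(outputs('Get_tables')?['body/value'])['id'] simply picks the id of the first table in the "Get tables" output. A Python sketch of the same selection, with a hypothetical output shape:

```python
# Hypothetical "Get tables" output: body/value is a list of the
# tables detected in the workbook, each with an id and a name.
get_tables_output = {
    "body/value": [
        {"id": "{00000000-0001-0000-0000-000000000000}", "name": "CheckinsTable"},
    ]
}

# Equivalent of first(outputs('Get_tables')?['body/value'])['id']
first_table_id = get_tables_output["body/value"][0]["id"]
print(first_table_id)
```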
Actions : Apply to each - Delete file
You can now use the columns of the Excel file inside the "Apply to each" row loop and apply your business logic to them.
Then, don't forget to "archive" or delete the file that was added to SharePoint for processing purposes:
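"List rows present in a table" returns one object per row, keyed by the Excel table's column headers, and the loop maps each row to a record. A minimal sketch of that mapping, where the column names and target field names are hypothetical:

```python
# Hypothetical rows as returned by "List rows present in a table":
# one dict per row, keyed by the Excel table's column headers.
rows = [
    {"Full Name": "Jane Doe", "Email": "jane@example.com", "Checked In": "Yes"},
    {"Full Name": "John Roe", "Email": "john@example.com", "Checked In": "No"},
]

def to_registration(row):
    """Map one Excel row to a (hypothetical) Dataverse record payload."""
    return {
        "name": row["Full Name"],
        "emailaddress": row["Email"],
        "checkedin": row["Checked In"].strip().lower() == "yes",
    }

# Equivalent of the work done inside the "Apply to each" loop.
records = [to_registration(r) for r in rows]
```

In the real flow this mapping happens in Dataverse "Add a new row" (or similar) actions inside the loop, one per target table.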
A few points to note:
- Excel Table: Your Excel file must contain an Excel table; otherwise, the flow won't be able to read its content.
- In order to read the columns of your Excel file, you must already have a clear, clean template with prepared columns that you use while building and running the flow. Those pre-designed columns shouldn't be updated or changed.
- By default, the flow can read up to 250 rows. If needed, this limit can very simply be raised to 5,000 by enabling pagination on the action. It gets considerably more complex if more than 5,000 rows need to be processed.
- If your Excel file contains "Date" or "Datetime" columns, I strongly advise selecting a "DateTime format" in the "List rows present in a table" action.
Ex: DateTime format = ISO 8601.
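The reason this matters: without a DateTime format, date cells come back as Excel serial numbers (days counted from 1899-12-30), which you would have to convert yourself; with ISO 8601 they arrive as directly parseable strings. A sketch of both cases:

```python
from datetime import datetime, timedelta

def excel_serial_to_datetime(serial):
    """Convert an Excel date serial number (days since 1899-12-30)
    to a Python datetime."""
    return datetime(1899, 12, 30) + timedelta(days=serial)

# Without a DateTime format, a cell holding 2021-01-01 arrives as 44197:
print(excel_serial_to_datetime(44197).date())  # 2021-01-01

# With DateTime format = ISO 8601, the same cell arrives as a string
# that parses directly:
print(datetime.fromisoformat("2021-01-01T09:30:00").isoformat())
```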