As a Power Platform Developer working with Azure Data Factory, I have come across numerous data flow and pipeline errors that do not always provide clear context as to why an integration is failing, especially when Dynamics 365 tables are used as either a source or a target.
Below are some of the most common mapping errors I have encountered while working in Azure Data Factory data flows and pipelines, along with the recommended steps for resolving them. My goal is to help users understand why each error is happening and to provide the recommended fixes, allowing for quicker resolutions and easier development.
Error #1:
Error Code: Error identified in Payload provided by the user for Entity : ''
Error Message: Microsoft.OData.ODataException: Cannot convert the literal '' to the expected type 'Edm.Int32'.
Cause and Recommendation:
While working on a data flow to insert and update records in Dataverse, I came across this error quite a few times on the Dataverse target step. It means Data Factory is attempting to write an empty string into a field that expects a number: the 'Edm.Int32' type indicates you are writing to an option set (choice) field in Dataverse. If the value being passed is an empty string (rather than null), this error is produced. It is common to map a non-value into an option set field, but Azure Data Factory requires that non-value to be null, not an empty string.
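The same rule applies when writing to Dataverse outside of a data flow, which is one way to see why the error occurs. Below is a minimal Python sketch against the Dataverse Web API; the org URL, token, and the new_indicator column are placeholders for illustration only:

import requests

# Hypothetical org URL and token, for illustration only.
api_url = "https://yourorg.api.crm.dynamics.com/api/data/v9.2"
headers = {
    "Authorization": "Bearer <access_token>",
    "Content-Type": "application/json",
}

indicator_label = ""  # value arriving from the source file

payload = {
    "name": "Contoso Ltd",
    # Option set (choice) columns map to integers (Edm.Int32).
    # Passing "" produces: Cannot convert the literal '' to the expected type 'Edm.Int32'.
    # Pass None (null) when there is no value, or the integer option value when there is.
    "new_indicator": None if indicator_label == "" else 100000001,
}

response = requests.post(f"{api_url}/accounts", json=payload, headers=headers)
response.raise_for_status()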
Another quick gotcha: if you are performing a CASE or IIF expression and checking a label to set the value, you always have to declare the null value as a string, as in the example below:
case(isNull(Indicator), toString(null()),
    Indicator=='No', '100000000',
    Indicator=='Yes', '100000001',
    toString(null()))
Error #2:
Error Code: DFExecutorUserError
Error Message: Microsoft.OData.ODataException: An undeclared property which only has property annotations in the payload but no property value was found in the payload. In OData, only declared navigation properties and declared named streams can be represented as properties without values.
Cause and Recommendation:
Many times, when writing data into Dataverse, some of the fields being updated or set are entity reference (lookup) fields. These fields require exact syntax in order to be set correctly, and this error means one of the lookup fields you are mapping in the Dataverse sink step is not formatted with the correct OData syntax. The correct format is table_relationship@odata.bind, for example, 'customerid_account@odata.bind'. This format resolves the table and unique identifier correctly and allows the lookup field to be set.
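To see what the sink ultimately has to send, here is a minimal sketch of the equivalent Dataverse Web API call in Python; the org URL, token, and GUID are placeholders, but the @odata.bind pattern (navigation property plus the related table's entity set and record GUID) is what the mapping needs to produce:

import requests

api_url = "https://yourorg.api.crm.dynamics.com/api/data/v9.2"
headers = {
    "Authorization": "Bearer <access_token>",
    "Content-Type": "application/json",
}

payload = {
    "title": "New service case",
    # Lookup (entity reference) columns use the @odata.bind syntax:
    #   <navigation property>@odata.bind : "/<entity set>(<GUID of related record>)"
    # Sending a bare GUID in a plain column triggers the "undeclared property" error.
    "customerid_account@odata.bind": "/accounts(11111111-1111-1111-1111-111111111111)",
}

# Create a case (incident) linked to the account referenced above.
response = requests.post(f"{api_url}/incidents", json=payload, headers=headers)
response.raise_for_status()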
Error #3:
Error Code: DF-Delimited-ColumnDelimiterMissed
Error Message: Column delimiter is required for parse
Cause and Recommendation:
A common task I come across with clients is ingesting a fixed-width text or CSV file and transforming the data into a format that can be broken out and integrated into a target environment such as Dataverse or Azure SQL. This error occurs when no column delimiter has been chosen in the Source settings. To fix it, confirm in your CSV settings that the correct column delimiter is selected, which is required for parsing the data.
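If you are not sure which delimiter a file actually uses before setting it in the dataset, a quick check outside of ADF can save a debug cycle. This is a minimal Python sketch using the standard library's csv.Sniffer; the file name is a placeholder:

import csv

# Hypothetical file name for illustration.
with open("incoming_export.csv", newline="") as f:
    sample = f.read(4096)

# Guess the delimiter from a sample of the file.
dialect = csv.Sniffer().sniff(sample, delimiters=",;\t|")
print(f"Detected column delimiter: {dialect.delimiter!r}")

# Set the same character as the Column delimiter in the ADF dataset/source settings.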
Error #4:
Error Code: DF-Executor-InvalidPath
Error Message: Path does not resolve to any files. Make sure the file/folder exists and is not hidden.
Cause and Recommendation:
A lot of Azure Data Factory flows I have built require a source connection into a file share such as Azure Blob Storage. Many clients want the ability to drop a file into a storage account, then have a data flow automatically pick up the file, and from there extract, transform and move it to a target system like Dynamics 365 or Azure SQL.
I have come across the above error message when the file or folder path is incorrect, or when the account running the data flow does not have access rights to the storage account. Before building these data flows, check the file/folder path in the source and verify that it exists and can be accessed by testing the connection within the data flow.
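Beyond the data flow's test connection button, you can also confirm the path and permissions with a few lines of code. Here is a minimal sketch using the azure-storage-blob package; the connection string, container, and folder names are placeholders:

from azure.storage.blob import ContainerClient

# Hypothetical connection string, container, and folder path.
conn_str = "<storage-account-connection-string>"
container = ContainerClient.from_connection_string(conn_str, container_name="inbound-files")

# List the blobs under the folder the data flow source points at.
blobs = list(container.list_blobs(name_starts_with="daily-drop/"))
if not blobs:
    print("No files found - check the folder path, file name, and whether the file is hidden.")
for blob in blobs:
    print(blob.name)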
Error #5:
Error Code: Unexpected exception occurred and execution failed.
Error Message: During Data Flow activity execution: Hit unexpected exception and execution failed.
Cause and Recommendation:
When running pipelines and data flows, you will come across errors that provide little context, such as the dreaded 'Unexpected Exception' error. This one can be tricky because the message gives no indication of why the run is failing. After some digging, the main cause of this error turns out to be a back-end service error. Many times, restarting your debug session, or simply retrying the operation, will resolve it.
Error #6:
Error Code: No output data during debug data preview
Error Message: There are a high number of null values or missing values which may be caused by having too few rows sampled. Try updating the debug row limit and refreshing the data.
Cause and Recommendation:
Building out complex data flows often requires debugging to see how the data is transformed at each step. Sometimes, clicking 'preview data' shows nothing in the preview window, or shows the error above, which can be frustrating. Check two things: one, that the schema has been imported into the source step before building out the subsequent steps; and two, that the source row limit in the Debug settings is large enough.
Error #7:
Error Code: DF-Dynamics-InvalidNullAlternateKeyColumn
Error Message: Any column value of alternate Key can’t be NULL
Cause and Recommendation:
When Dynamics 365 is your target system within a data flow and you are inserting or updating data in Dataverse tables, any field that has been set up as part of an alternate key must be included in the target mappings and must not be NULL. Dataverse expects a valid value in every alternate key column, based on the key rules set up within Dynamics 365.
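One way to catch this before the sink step is to pre-validate the incoming rows. Below is a minimal Python sketch with pandas, where accountnumber is just a stand-in for whatever column backs your alternate key:

import pandas as pd

# Hypothetical source data; 'accountnumber' stands in for the alternate key column.
df = pd.DataFrame(
    {
        "accountnumber": ["ACC-001", None, "ACC-003"],
        "name": ["Contoso", "Fabrikam", "Adventure Works"],
    }
)

# Rows with a NULL alternate key will fail the Dataverse upsert, so split them out.
bad_rows = df[df["accountnumber"].isna()]
good_rows = df[df["accountnumber"].notna()]

print(f"{len(bad_rows)} row(s) missing the alternate key value:")
print(bad_rows)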
Error #8:
Error Code: DF-Executor-BroadcastTimeout
Error Message: Broadcast join timeout error, you can choose ‘Off’ of broadcast option in join/exists/lookup transformation to avoid this issue. If you intend to broadcast join option to improve performance, then make sure broadcast stream can produce data within 60 secs in debug runs and 300 secs in job runs.
Cause and Recommendation:
In transformations that take multiple input streams (join, exists, and lookup), broadcasting has a default timeout of 60 seconds on debug runs and 300 seconds on job runs. This error means the stream chosen for broadcast is too large to produce data within that limit.
Check the Optimize tab on your join, exists, and lookup transformations. The default broadcast option is Auto; whether Auto is set or you are manually setting the left or right side to broadcast under Fixed, you can either move to a larger Azure integration runtime (IR) configuration or turn broadcast off. For the best performance in data flows, it is recommended to let Spark broadcast by using Auto and to use a memory-optimized Azure IR. Large Azure SQL Data Warehouse tables and source files are typically not good broadcast candidates. If you are not using a broadcast join, use a larger cluster when this error occurs.
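Since data flows run on Spark behind the scenes, the broadcast setting corresponds to Spark's broadcast join hint. The following PySpark sketch shows the same trade-off; the paths, table names, and join key are placeholders:

from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.getOrCreate()

orders = spark.read.parquet("/data/orders")        # large stream
accounts = spark.read.parquet("/data/accounts")    # small lookup stream

# Broadcasting the small side avoids a shuffle, which is what the Auto/Fixed
# broadcast option does in a data flow join. If the broadcast side cannot be
# produced within the timeout, drop the hint (broadcast 'Off') and let Spark
# perform a regular shuffle join instead.
joined = orders.join(broadcast(accounts), on="accountid", how="left")
joined.show(5)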
From experience, these have been the most common errors I have come across in ADF development, but with these fix recommendations, hopefully your research and troubleshooting will be a little easier to manage.
By Mike Mitchell, Senior Consultant
Working with New Dynamic
New Dynamic is a Microsoft Solutions Partner focused on Dynamics 365 Customer Engagement and the Power Platform. Our team of dedicated professionals strives to provide first-class experiences incorporating integrity, teamwork, and a relentless commitment to our clients' success.
Contact Us today to transform your sales productivity and customer buying experiences.