Wednesday, 6 May 2026

How to Delete an SSRS Report from SQL Server Reporting Services (SSRS) in D365FO

 In Microsoft Dynamics 365 Finance & Operations, SSRS reports are deployed to the SSRS server and stored in the ReportServer database. While reports are typically managed through Visual Studio and deployment pipelines, there are situations where you may need to manually remove a report directly from the SSRS database.

This article walks through a safe and structured way to delete an SSRS report from SQL Server Reporting Services.

⚠️ Important Considerations

Before proceeding:

  • Always take a backup of the ReportServer database.
  • Avoid deleting records directly in production unless absolutely necessary.
  • Ensure the report is not required by any active processes.
  • Prefer standard deployment cleanup where possible.

Step 1: Connect to the ReportServer Database


  1. Open SQL Server Management Studio (SSMS).
  2. Connect to your SQL Server instance.
  3. Locate the ReportServer database (commonly named ReportServer or DynamicsAXReportServer).
  4. Right-click the database → New Query.

Step 2: Identify the Report in the Catalog Table

All SSRS reports are stored in the Catalog table.

Use the following query to locate your report:

SELECT * FROM Catalog WHERE Name LIKE '%<ReportName>%'

Example

SELECT *
FROM Catalog
WHERE Name LIKE '%SalesInvoice%'

This query returns:

  • Report name
  • ItemID (important for dependencies)
  • Path and metadata
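If the wildcard search returns many rows, a narrower projection makes the right ItemID easier to pick out. A minimal sketch, assuming the standard ReportServer schema, where `Type = 2` marks report items:

```sql
-- Return only the columns needed for cleanup.
-- Type = 2 filters to reports in the standard ReportServer schema.
SELECT ItemID, Name, [Path], Type
FROM Catalog
WHERE Name LIKE '%SalesInvoice%'
  AND Type = 2
```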

Step 3: Check for Dependencies in DataSource Table

Before deleting the report, check whether it has related entries in dependent tables such as DataSource.

SELECT *
FROM DataSource AS t
WHERE t.ItemID IN ('<ItemID1>', '<ItemID2>')

The ItemID values come from the results of the Catalog query.
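Instead of copying ItemID values by hand, you can also join Catalog and DataSource in a single query. A sketch, assuming the standard schema where `DataSource.ItemID` references `Catalog.ItemID`:

```sql
-- List the data source entries for every matching report in one pass.
SELECT c.Name, c.[Path], ds.*
FROM DataSource AS ds
INNER JOIN Catalog AS c
    ON c.ItemID = ds.ItemID
WHERE c.Name LIKE '%SalesInvoice%'
```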

Step 4: Delete Dependent Records

If records exist in the DataSource table, delete them first to avoid orphaned references.

DELETE t
FROM DataSource AS t
WHERE t.ItemID IN ('<ItemID1>', '<ItemID2>')

You can also delete a single dependency:

DELETE t
FROM DataSource AS t
WHERE t.ItemID = '<ItemID>'

Step 5: Delete Report from Catalog

Once dependencies are removed, delete the report from the Catalog table.

DELETE FROM Catalog
WHERE Name IN ('ReportName_Version1', 'ReportName_Version2')

Example

DELETE FROM Catalog
WHERE Name = 'SalesInvoice'
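Because these deletes are irreversible, it can help to wrap both statements in an explicit transaction and inspect the affected row counts before committing. A sketch, keyed on the same `<ItemID>` placeholder rather than the report name:

```sql
BEGIN TRANSACTION;

-- Remove dependent rows first, then the catalog entry itself.
DELETE FROM DataSource WHERE ItemID = '<ItemID>';
DELETE FROM Catalog    WHERE ItemID = '<ItemID>';

-- Check the row counts reported by SSMS; if anything looks wrong:
--   ROLLBACK TRANSACTION;
-- otherwise:
COMMIT TRANSACTION;
```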

Step 6: Validate Cleanup

Run the following again to confirm deletion:

SELECT *
FROM Catalog
WHERE Name LIKE '%SalesInvoice%'

Also re-check dependencies:

SELECT *
FROM DataSource
WHERE ItemID = '<ItemID>'
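The two checks can be collapsed into one query that should return zero rows after a successful cleanup. A sketch, assuming `DataSource.ItemID` references `Catalog.ItemID`:

```sql
-- Any surviving Catalog or DataSource rows indicate an incomplete cleanup.
SELECT c.ItemID, c.Name, ds.Name AS DataSourceName
FROM Catalog AS c
LEFT JOIN DataSource AS ds
    ON ds.ItemID = c.ItemID
WHERE c.Name LIKE '%SalesInvoice%'
```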

Best Practices

  • Use naming conventions to easily identify reports.
  • Avoid direct DB deletion during active deployments.
  • Use SSRS Web Portal or deployment tools when possible.
  • Document any manual changes for audit purposes.

When Should You Use This Approach?

This method is useful when:

  • A report is stuck or partially deployed
  • Duplicate reports exist
  • Deployment cleanup failed
  • You need to remove legacy artifacts quickly 

Final Thoughts

Directly modifying the SSRS ReportServer database should always be a last resort. However, when used carefully, this approach gives you precise control over report cleanup and helps resolve deployment inconsistencies in D365FO environments.

If you find yourself doing this often, it may be worth reviewing your deployment pipeline or report lifecycle management process.

Friday, 16 January 2026

Generate CAR report

Generate CAR Report:

1. Open a Command Prompt as Administrator.

2. cd K:\AosService\PackagesLocalDirectory\bin

3. Run the command below:

K:\AosService\PackagesLocalDirectory\bin\xppbp.exe -packagesroot="K:\AosService\PackagesLocalDirectory" -metadata="K:\AosService\PackagesLocalDirectory" -all -model="MW" -xmlLog=C:\temp\BPCheckLogcd.xml -module="MW" -car=c:\temp\CAReport.xlsx

Sunday, 16 November 2025

Automating Credit Card Transaction File Transformation Using Azure Logic Apps and D365 Finance & Operations

 

Modern ERP environments increasingly depend on reliable, automated integration processes. After recent platform updates, certain internal transformation capabilities within Dynamics 365 Finance & Operations (D365FO) are restricted—especially those related to transforming flat‑file formats (TXT/CSV) using XSLT inside the application. 

This article presents a generic, scalable approach using Azure Logic Apps to transform incoming transaction files (TXT → XML) before they are imported into D365FO.

## ๐ŸŒ 1. Overview

Many organizations receive credit card transaction files or similar flat‑file datasets from banking partners or external systems. After system updates, internal transformation using XSLT inside D365FO may no longer be supported, prompting the need for external transformation tools.

 **Solution:** 

Use **Azure Logic Apps + Integration Account** to handle external TXT‑to‑XML transformation, store processed XML files in SharePoint (or Azure Storage), and finally import them into D365FO using Data Management (DMF).

## 2. High-Level Architecture

Below is the generic architecture flow:

```
[External System / Bank]
          |
        (TXT)
          |
[SharePoint / Storage]
          |
   Trigger → Logic App
          |
XSLT Mapping (Integration Account)
          |
     (XML Output)
          |
[Processed Folder in SharePoint]
          |
   DMF Import to D365FO
```
## 3. End-to-End Process Flow

Logic App Implementation

3.1 Prerequisites / Access you need

  1. Azure subscription with permission to create resources (Resource Group, Logic App, Integration Account, Storage Account).
  2. Resource Group where you will deploy resources.
  3. Storage Account with at least one container:
    • incoming (where CSV is uploaded)
    • output (where the transformed XML output is saved)
  4. Integration Account (Enterprise Integration) (required for Flat File decoding and for Maps/XSLT). You must be owner or contributor to create and link it.
  5. Permissions / credentials:
    • Contributor / Logic App Operator on Logic App resource.
    • Storage Blob Data Contributor or access keys (to let Logic App create/read blobs). You can use:
      • Managed Identity for the Logic App + RBAC on the Storage account (recommended), or
      • Storage account connection using storage account key / SAS.

 

The Logic App includes the following steps:

A. Create resources/ use existing

  1. Sign into Azure Portal.
  2. Create a Resource Group (if not present).
  3. Create an Integration Account:
    • Search "Integration Account" in the portal -> Create.
    • SKU: Standard (or depending on your needs).
    • Link it later to your Logic App (Standard or Consumption) in the Logic App settings.
  4. Upload map (XSLT) to Integration Account:
    • In Integration Account -> Maps -> Add -> Upload your .xslt file (choose type Map).
    • Use the XSLT generated from D365FO DMF (Import project → Download).

B. Create the Logic App (Standard recommended)

  1. Create a Logic App (Standard):
    • Resource type: Logic App (Standard)
    • Plan: either create a new App Service environment or use default single-tenant plan.
    • On creation, in Workflow settings you can link the Integration Account that you created.
    • After creation, open the Logic App in Designer.

C. Build the workflow (Designer) — exact actions & settings

In the Logic App designer, create a workflow and add the following actions:

  1. When a file is created (properties only) — Trigger on new file creation in SharePoint.

```json
{
  "type": "ApiConnection",
  "inputs": {
    "host": {
      "connection": {
        "referenceName": "sharepointonline-1"
      }
    },
    "method": "get",
    "path": "/datasets/@{encodeURIComponent(encodeURIComponent('https://yoursite.sharepoint.com/sites/d365/dev'))}/tables/@{encodeURIComponent(encodeURIComponent('cfcxxx696-e5a4-4c28-a2a7-60d7ff06fa1'))}/onnewfileitems",
    "queries": {
      "folderPath": "/AMEX Credit Card Transactions"
    }
  },
  "recurrence": {
    "frequency": "Minute",
    "interval": 2
  },
  "splitOn": "@triggerBody()?['value']"
}
```

 

  2. Initialize variable — to hold the file content temporarily.
    • Name: filecontent
    • Type: string
    • Code:

```json
{
  "type": "InitializeVariable",
  "inputs": {
    "variables": [
      {
        "name": "filecontent",
        "type": "string"
      }
    ]
  },
  "runAfter": {}
}
```

 

  3. Get file content — retrieve the uploaded TXT file content.

```json
{
  "type": "ApiConnection",
  "inputs": {
    "host": {
      "connection": {
        "referenceName": "sharepointonline-1"
      }
    },
    "method": "get",
    "path": "/datasets/@{encodeURIComponent(encodeURIComponent('https://yoursite.sharepoint.com/sites/d365/dev'))}/files/@{encodeURIComponent(triggerBody()?['{Identifier}'])}/content",
    "queries": {
      "inferContentType": true
    }
  },
  "runAfter": {
    "Initialize_variable": [
      "Succeeded"
    ]
  }
}
```

 

  4. Set variable — assign the file content to the variable, wrapped in a <Batch> root element and a CDATA section so that special characters in the raw text do not break XML parsing.
    • Name: filecontent
    • Value: <Batch><![CDATA[ ]]></Batch>
    • Code:

```json
{
  "type": "SetVariable",
  "inputs": {
    "name": "filecontent",
    "value": "<Batch><![CDATA[@{body('Get_file_content')}\n]]></Batch>"
  },
  "runAfter": {
    "Get_file_content": [
      "Succeeded"
    ]
  }
}
```

 

  5. Transform XML (1) — apply the first XSLT transformation (TXT → XML).
    • Content: filecontent
    • Map Name: select the map (XSLT) uploaded in step A.4.
    • Code:

```json
{
  "type": "Xslt",
  "inputs": {
    "content": "@variables('filecontent')",
    "integrationAccount": {
      "map": {
        "name": "ECLAMEXTXT2AX"
      }
    }
  },
  "runAfter": {
    "Set_variable": [
      "Succeeded"
    ]
  }
}
```
  6. Transform XML (2) — apply the second XSLT transformation (XML → FinOps XML).
    • Content: filecontent
    • Map Name: select the map (XSLT) uploaded in step A.4.
    • Code:

```json
{
  "type": "Xslt",
  "inputs": {
    "content": "@variables('filecontent')",
    "integrationAccount": {
      "map": {
        "name": "ECLAMEXTXT2AX"
      }
    },
    "transformOptions": "GenerateTextOutput"
  },
  "runAfter": {
    "Transform_XML_1": [
      "Succeeded"
    ]
  }
}
```

 

  7. Create file — store the transformed XML in the processed folder on SharePoint.

```json
{
  "type": "ApiConnection",
  "inputs": {
    "host": {
      "connection": {
        "referenceName": "sharepointonline-1"
      }
    },
    "method": "post",
    "body": "@body('Transform_XML')",
    "path": "/datasets/@{encodeURIComponent(encodeURIComponent('https://yoursite.sharepoint.com/sites/d365/dev'))}/files",
    "queries": {
      "folderPath": "/Operations/AMEX Credit Card Transactions/Processed",
      "name": "@{triggerBody()?['{Name}']}.xml",
      "queryParametersSingleEncoded": true
    }
  },
  "runAfter": {
    "Transform_XML": [
      "Succeeded"
    ]
  },
  "runtimeConfiguration": {
    "contentTransfer": {
      "transferMode": "Chunked"
    }
  }
}
```

 


