How to Integrate SSIS, SSAS, and SSRS for Complete BI Pipelines in 2026


Introduction

In 2026, Microsoft's BI stack remains essential for enterprises handling large on-premises or hybrid data volumes. SSIS excels at robust ETL, SSAS Tabular delivers scalable semantic models queried with DAX, and SSRS provides pixel-perfect paginated reports (also hosted by Power BI Report Server). This tutorial walks through orchestrating a complete pipeline: extraction and loading via SSIS, analytic modeling with SSAS, and SSRS report publishing.

Why it matters: sub-second DAX/MDX query response and TMSL/PowerShell automation can substantially reduce operating cost. We start from a source database and end with a dynamic report, with scripts you can adapt directly. Ideal for BI architects managing terabytes.

Prerequisites

  • SQL Server 2022+ (Developer/Enterprise) with SSIS, SSAS, SSRS installed in native mode.
  • Visual Studio 2022 + SSDT (SQL Server Data Tools) for initial development.
  • SQL Server Management Studio (SSMS) 20+.
  • PowerShell 7+ with SqlServer and ReportingServicesTools modules.
  • AdventureWorks sample database installed (download from learn.microsoft.com).
  • Sysadmin rights on SSIS/SSAS/SSRS instances.

Step 1: Prepare Source and Target Databases

Start by creating a simulated source database with sales data and a target database for the data warehouse. This demonstrates a typical ETL pattern: truncate the target, then perform an aggregated insert. Run the scripts in SSMS.

Creating Databases and Tables

01_create_databases.sql
USE [master];
GO

IF DB_ID('BI_Source') IS NOT NULL DROP DATABASE BI_Source;
CREATE DATABASE BI_Source;
GO

USE BI_Source;
CREATE TABLE Sales (
    SaleID INT IDENTITY(1,1) PRIMARY KEY,
    Product NVARCHAR(50),
    Category NVARCHAR(30),
    Quantity INT,
    UnitPrice DECIMAL(10,2),
    SaleDate DATE,
    Region NVARCHAR(20)
);
GO

IF DB_ID('BI_Target') IS NOT NULL DROP DATABASE BI_Target;
CREATE DATABASE BI_Target;
GO

USE BI_Target;
CREATE TABLE DimProduct (
    ProductKey INT IDENTITY PRIMARY KEY,
    Product NVARCHAR(50),
    Category NVARCHAR(30)
);

CREATE TABLE FactSales (
    SaleKey INT IDENTITY PRIMARY KEY,
    ProductKey INT,
    Quantity INT,
    TotalAmount DECIMAL(12,2),
    SaleDate DATE,
    Region NVARCHAR(20),
    LoadDate DATETIME DEFAULT GETDATE()
);
GO

This script drops and recreates the source database (raw sales) and the target (star schema: dimension and fact tables), giving SSIS a data-mart schema to load. Pitfall: keep both databases on the same collation, or cross-database joins and sorts may fail.

Inserting Sample Data

02_seed_data.sql
USE BI_Source;
INSERT INTO Sales (Product, Category, Quantity, UnitPrice, SaleDate, Region) VALUES
('Laptop Pro', 'Electronics', 5, 1200.00, '2024-01-15', 'Europe'),
('Mouse Wireless', 'Accessories', 20, 25.00, '2024-01-16', 'North America'),
('Keyboard Mech', 'Accessories', 10, 150.00, '2024-02-01', 'Europe'),
('Desktop Tower', 'Electronics', 3, 1800.00, '2024-02-05', 'Asia'),
('Laptop Pro', 'Electronics', 8, 1200.00, '2024-02-10', 'North America');
GO

-- Verification
SELECT COUNT(*) FROM Sales; -- 5 rows

Inserts five realistic rows to test the ETL; totals are calculated in SSIS. Best practice: use recent dates so you can later validate time-based partitions in SSAS.
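Before trusting the SSIS aggregation, you can compute the expected totals by hand. A small illustrative Python sketch (not part of the pipeline), mirroring the GROUP BY the package will perform over these five rows:

```python
from collections import defaultdict

# Seed rows from 02_seed_data.sql: (product, category, qty, unit_price, sale_date, region)
rows = [
    ("Laptop Pro", "Electronics", 5, 1200.00, "2024-01-15", "Europe"),
    ("Mouse Wireless", "Accessories", 20, 25.00, "2024-01-16", "North America"),
    ("Keyboard Mech", "Accessories", 10, 150.00, "2024-02-01", "Europe"),
    ("Desktop Tower", "Electronics", 3, 1800.00, "2024-02-05", "Asia"),
    ("Laptop Pro", "Electronics", 8, 1200.00, "2024-02-10", "North America"),
]

# Mirrors the source query: SUM(Quantity * UnitPrice) GROUP BY Product, Category, SaleDate, Region
totals = defaultdict(float)
for product, category, qty, price, date, region in rows:
    totals[(product, category, date, region)] += qty * price

grand_total = sum(totals.values())
print(grand_total)  # 23000.0
```

If the FactSales load does not sum to these figures, the Data Flow aggregation is misconfigured.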

Step 2: Create and Deploy the SSIS Package

SSIS handles the ETL: truncate the fact table, look up dimension keys, and perform the aggregated insert. Import the .dtsx into a Visual Studio SSDT Integration Services project (right-click SSIS Packages > Add Existing Package), then deploy the project to the SSIS Catalog (SSISDB).

Complete SSIS ETL Package (Data Flow)

ETL_Sales.dtsx
<?xml version="1.0"?><DTS:Executable xmlns:DTS="www.microsoft.com/SqlServer/Dts" xmlns:DTSProp="www.microsoft.com/SqlServer/Dts/ReadWriteObjects" xmlns:DTSPropT="www.microsoft.com/SqlServer/Dts/ReadWriteObjects/Type" xmlns:SQLTASK="www.microsoft.com/SqlServer/Dts/SQLTask" DTS:ExecutableType="SSIS.Package.2"><DTS:LoggingOptions DTS:LoggingOptionsCustom="0" DTS:LoggingOptionsDefault="1"></DTS:LoggingOptions><DTS:Container DTS:LoggingOptions="{FDB415D0-9ECE-4B00-9B2F-8F00F6B2A9D0}" DTS:PrecedenceConstraintsVisible="1"><DTS:ConnectionManager DTS:ConnectionString="Data Source=.;Initial Catalog=BI_Source;Provider=SQLNCLI11.1;Integrated Security=SSPI;Auto Translate=False;" DTS:ConnectionManagerID="{A1B2C3D4-E5F6-7890-ABCD-EF1234567890}" DTS:ConnectionManagerTypeID="{BB7879FA-D5F6-497A-80CB-9AB16E3B3D4E}" DTS:DesignTimeProperties="Provider=SQLNCLI11.1;DataTypeCompatibility=80;DefaultDataSource=.;DefaultCatalog=BI_Source;DefaultSchema=dbo;"><DTS:ObjectData><DTS:ConnectionManager DTS:DTSID="{A1B2C3D4-E5F6-7890-ABCD-EF1234567890}"><DTS:Property DTS:Name="Provider">SQLNCLI11.1</DTS:Property></DTS:ConnectionManager></DTS:ObjectData></DTS:ConnectionManager><DTS:ConnectionManager DTS:ConnectionString="Data Source=.;Initial Catalog=BI_Target;Provider=SQLNCLI11.1;Integrated Security=SSPI;" DTS:ConnectionManagerID="{B2C3D4E5-F6A7-8901-BCDE-F23456789012}" DTS:ConnectionManagerTypeID="{BB7879FA-D5F6-497A-80CB-9AB16E3B3D4E}" DTS:DesignTimeProperties="Provider=SQLNCLI11.1;DefaultDataSource=.;DefaultCatalog=BI_Target;"><DTS:ObjectData><DTS:ConnectionManager DTS:DTSID="{B2C3D4E5-F6A7-8901-BCDE-F23456789012}"></DTS:ConnectionManager></DTS:ObjectData></DTS:ConnectionManager><DTS:Executable DTS:ExecutableType="SSIS.Pipeline.2" DTS:LoggingOptions="{FDB415D0-9ECE-4B00-9B2F-8F00F6B2A9D0}" DTS:PrecedenceConstraintsVisible="1" DTS:RefId="{C3D4E5F6-A7B8-9012-CDEF-345678901234}" DTS:ExecutionStatus="NotReady" DTS:IdentifyingObjectName="Data Flow Task" DTS:ObjectName="Data Flow Task"><DTS:PipelineTask DTS:CreationDate="02/01/2026 10:00:00" DTS:CreationName="\Package\Data Flow Task" DTS:LoggingOptions="{FDB415D0-9ECE-4B00-9B2F-8F00F6B2A9D0}" DTS:ObjectName="Data Flow Task" DTS:RefId="{C3D4E5F6-A7B8-9012-CDEF-345678901234}"><DTS:PipelineEngine DTS:CreationDate="02/01/2026 10:00:00" DTS:DefaultBufferMaxRows="10000" DTS:DefaultBufferSize="10485760" DTS:EngineThreads="5"><DTS:ObjectData><DTS:ComponentMetadata DTS:ComponentName="OLE DB Source" DTS:CreationDate="02/01/2026 10:00:00" DTS:CreationName="OLE DB Source 1" DTS:CreationType="1" DTS:Id="10" DTS:LocaleID="1033"><DTS:Property DTS:Name="AccessMode">0</DTS:Property><DTS:Property DTS:Name="ConnectionManagerID">{A1B2C3D4-E5F6-7890-ABCD-EF1234567890}</DTS:Property><DTS:Property DTS:Name="ConnectionManagerName">BI_Source</DTS:Property><DTS:Property DTS:Name="OpenRowset">dbo.Sales</DTS:Property><DTS:Property DTS:Name="SqlCommand">SELECT Product, Category, SUM(Quantity) AS Quantity, SUM(Quantity * UnitPrice) AS TotalAmount, SaleDate, Region FROM Sales GROUP BY Product, Category, SaleDate, Region</DTS:Property></DTS:ComponentMetadata><DTS:ComponentMetadata DTS:ComponentName="OLE DB Destination" DTS:CreationDate="02/01/2026 10:00:00" DTS:CreationName="OLE DB Destination 1" DTS:CreationType="1" DTS:Id="20" DTS:LocaleID="1033"><DTS:Property DTS:Name="ConnectionManagerID">{B2C3D4E5-F6A7-8901-BCDE-F23456789012}</DTS:Property><DTS:Property DTS:Name="ConnectionManagerName">BI_Target</DTS:Property><DTS:Property DTS:Name="OpenRowset">dbo.FactSales</DTS:Property><DTS:Property DTS:Name="BulkInsertCheckConstraints">false</DTS:Property><DTS:Property DTS:Name="RowsPerBatch">0</DTS:Property><DTS:Property DTS:Name="MaximumInsertCommitSize">0</DTS:Property></DTS:ComponentMetadata></DTS:ObjectData></DTS:PipelineEngine></DTS:PipelineTask></DTS:Executable></DTS:Container><DTS:PackageVariable DTS:Name="User::LoadDate" DTS:DataType="7" DTS:ValueType="Reference"><DTS:VariableValue xmlns:DTS="www.microsoft.com/SqlServer/Dts" DTS:DataSubType="DT_DBTIMESTAMP" /></DTS:PackageVariable></DTS:Executable>

A minimal package with one Data Flow: an aggregated SQL source reads BI_Source.Sales into BI_Target.FactSales (DimProduct is assumed to be populated separately). Import it into VS SSDT to validate and edit. Pitfall: regenerate the ConnectionManagerID GUIDs on import so each remains unique and valid.
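Since DimProduct is loaded separately, an SSIS Lookup transform (or a seeding pass) has to map each product to its surrogate key. The matching logic can be sketched in illustrative Python, with hypothetical in-memory tables standing in for DimProduct and the aggregated source rows:

```python
# Hypothetical in-memory stand-ins for DimProduct and the aggregated source rows
dim_product = {}   # (product, category) -> surrogate ProductKey
fact_rows = []

def lookup_or_add(product: str, category: str) -> int:
    """Mimic an SSIS Lookup with a 'redirect no-match to insert' path:
    return the existing surrogate key, or create a new dimension row."""
    key = (product, category)
    if key not in dim_product:
        dim_product[key] = len(dim_product) + 1  # IDENTITY-style key
    return dim_product[key]

for product, category, qty, total in [
    ("Laptop Pro", "Electronics", 5, 6000.00),
    ("Mouse Wireless", "Accessories", 20, 500.00),
    ("Laptop Pro", "Electronics", 8, 9600.00),
]:
    fact_rows.append({"ProductKey": lookup_or_add(product, category),
                      "Quantity": qty, "TotalAmount": total})

print(fact_rows[0]["ProductKey"], fact_rows[2]["ProductKey"])  # same key: 1 1
```

The point of the sketch: repeated products must resolve to the same surrogate key, never duplicate dimension rows.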

SSIS Deployment to SSISDB Catalog

03_deploy_ssis.sql
USE SSISDB;
GO

-- Create folder if it does not exist
IF NOT EXISTS (SELECT * FROM catalog.folders WHERE name = N'BI')
    EXEC catalog.create_folder @folder_name = N'BI', @folder_description = N'BI Pipeline';
GO

-- Deploy the project first (build the .ispac in VS, or use catalog.deploy_project),
-- then create and start an execution; create_execution returns the id via an OUTPUT parameter
DECLARE @execution_id BIGINT;
EXEC [SSISDB].[catalog].[create_execution]
    @folder_name = N'BI',
    @project_name = N'ETL_Project',
    @package_name = N'ETL_Sales.dtsx',
    @use32bitruntime = 0,
    @reference_id = NULL,
    @execution_id = @execution_id OUTPUT;

EXEC [SSISDB].[catalog].[start_execution] @execution_id;

-- Verification
SELECT * FROM catalog.executions WHERE execution_id = @execution_id;

This T-SQL creates the catalog folder, assumes the project itself was deployed from Visual Studio (.ispac), then creates and starts an execution. Monitor runs through the SSISDB operation reports in SSMS. Pitfall: SSISDB requires CLR integration enabled on the instance; in production, grant catalog-level permissions rather than sysadmin.
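catalog.executions reports status as an integer code. The documented code values and a generic polling pattern can be sketched in illustrative Python; the fetch_status callable is a stand-in for a real query against SSISDB, not a client-library call:

```python
import time

# Documented SSISDB catalog.executions status codes
STATUS = {1: "created", 2: "running", 3: "canceled", 4: "failed",
          5: "pending", 6: "ended unexpectedly", 7: "succeeded",
          8: "stopping", 9: "completed"}
TERMINAL = {3, 4, 6, 7, 9}

def wait_for_execution(fetch_status, poll_seconds=0.0):
    """Poll a status-fetching callable until the execution reaches a terminal state."""
    while True:
        code = fetch_status()
        if code in TERMINAL:
            return STATUS[code]
        time.sleep(poll_seconds)

# Stub standing in for "SELECT status FROM catalog.executions WHERE execution_id = @id"
states = iter([1, 2, 2, 7])
print(wait_for_execution(lambda: next(states)))  # succeeded
```

In a real agent job you would poll with a sensible interval and alert on "failed" or "ended unexpectedly".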

Step 3: SSAS Tabular Modeling and Deployment

SSAS Tabular delivers fast in-memory analytics. Use scriptable TMSL (JSON) for repeatable CI/CD deployments. The model connects to BI_Target and defines a FactSales table with a Total Sales measure.

TMSL Script for SSAS Tabular Model

04_ssas_model.tmsl
{
  "create": {
    "database": {
      "name": "BI_Model",
      "compatibilityLevel": 1500,
      "model": {
        "dataSources": [
          {
            "name": "BI_Target_DS",
            "connectionString": "Provider=SQLNCLI11;Data Source=localhost;Initial Catalog=BI_Target;Integrated Security=SSPI;",
            "impersonationMode": "impersonateServiceAccount"
          }
        ],
        "tables": [
          {
            "name": "FactSales",
            "columns": [
              {"name": "ProductKey", "dataType": "int64", "sourceColumn": "ProductKey"},
              {"name": "Quantity", "dataType": "int64", "sourceColumn": "Quantity"},
              {"name": "TotalAmount", "dataType": "double", "sourceColumn": "TotalAmount"},
              {"name": "Region", "dataType": "string", "sourceColumn": "Region"},
              {"name": "SaleDate", "dataType": "dateTime", "sourceColumn": "SaleDate"}
            ],
            "partitions": [
              {
                "name": "LivePartition",
                "source": {
                  "type": "query",
                  "dataSource": "BI_Target_DS",
                  "query": "SELECT ProductKey, Quantity, TotalAmount, Region, SaleDate FROM dbo.FactSales"
                }
              }
            ],
            "measures": [
              {
                "name": "Total Sales",
                "expression": "SUM(FactSales[TotalAmount])",
                "formatString": "#,##0.00"
              }
            ]
          }
        ]
      }
    }
  }
}

This TMSL creates the 'BI_Model' Tabular database with a FactSales table whose single partition queries BI_Target directly, plus the Total Sales DAX measure. Run it from an XMLA query window in SSMS or via PowerShell. Pitfall: each partition's dataSource must match a data source defined in the model.
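The dataSource pitfall can be caught before deployment: every partition must reference a data source the model actually defines. A small illustrative Python check over TMSL JSON (the inline sample is a toy document, not the full model):

```python
import json

def find_key(obj, key):
    """Recursively find the first value stored under `key` in nested JSON."""
    if isinstance(obj, dict):
        if key in obj:
            return obj[key]
        for v in obj.values():
            found = find_key(v, key)
            if found is not None:
                return found
    elif isinstance(obj, list):
        for v in obj:
            found = find_key(v, key)
            if found is not None:
                return found
    return None

def check_partition_sources(tmsl_text):
    """Return partition dataSource names that are not defined in model.dataSources."""
    model = find_key(json.loads(tmsl_text), "model")
    defined = {ds["name"] for ds in model.get("dataSources", [])}
    missing = set()
    for table in model.get("tables", []):
        for part in table.get("partitions", []):
            src = part.get("source", part)  # tolerate both TMSL partition shapes
            name = src.get("dataSource")
            if name and name not in defined:
                missing.add(name)
    return missing

# Toy document: one partition pointing at an undefined data source
tmsl = ('{"create": {"database": {"model": {"dataSources": [{"name": "DS1"}], '
        '"tables": [{"partitions": [{"source": {"dataSource": "DS2"}}]}]}}}}')
print(check_partition_sources(tmsl))  # {'DS2'}
```

Running such a check in CI rejects a model before Invoke-ASCmd ever sees it.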

SSAS Deployment via PowerShell

05_deploy_ssas.ps1
Import-Module SqlServer

$server = "localhost\TABULAR"
$tmslPath = "C:\path\to\04_ssas_model.tmsl"
$tmslScript = Get-Content $tmslPath -Raw

# Invoke-ASCmd sends TMSL/XMLA via -Query (or -InputFile);
# the target database is named inside the TMSL itself
Invoke-ASCmd -Server $server -Query $tmslScript

# Full refresh of the partition; note the required top-level "refresh" key
$refresh = @{
    refresh = @{
        type = "full"
        objects = @(
            @{
                database = "BI_Model"
                table = "FactSales"
                partition = "LivePartition"
            }
        )
    }
} | ConvertTo-Json -Depth 10

Invoke-ASCmd -Server $server -Query $refresh

Write-Output "SSAS model deployed and refreshed."

The script uses the SqlServer module's Invoke-ASCmd to run the TMSL create, then a full refresh of the partition. Adapt the server and instance name. Pitfall: for Azure Analysis Services the XMLA endpoint must be reachable; on-premises instances accept XMLA/TMSL by default.

Step 4: Create and Publish SSRS Report

SSRS produces paginated reports. Create an RDL connected to SSAS with a tablix listing Total Sales by Region, then publish it via PowerShell for automation.

Complete SSRS Report RDL Definition

06_SalesReport.rdl
<?xml version="1.0" encoding="utf-8"?>
<Report xmlns="http://schemas.microsoft.com/sqlserver/reporting/2008/01/reportdefinition" xmlns:rd="http://schemas.microsoft.com/SQLServer/reporting/reportdesigner">
  <Body>
    <ReportItems>
      <Tablix Name="Tablix1">
        <TablixBody>
          <TablixColumns>
            <TablixColumn><Width>2in</Width></TablixColumn>
            <TablixColumn><Width>2in</Width></TablixColumn>
            <TablixColumn><Width>2in</Width></TablixColumn>
          </TablixColumns>
          <TablixRows>
            <TablixRow>
              <Height>0.25in</Height>
              <TablixCells>
                <TablixCell>
                  <CellContents>
                    <Textbox Name="RegionHeader"><Value>Region</Value></Textbox>
                  </CellContents>
                </TablixCell>
                <TablixCell>
                  <CellContents>
                    <Textbox Name="TotalSalesHeader"><Value>Total Sales</Value></Textbox>
                  </CellContents>
                </TablixCell>
              </TablixCells>
            </TablixRow>
            <TablixRow>
              <Height>0.25in</Height>
              <TablixCells>
                <TablixCell>
                  <CellContents>
                    <Textbox Name="Region"><Value>=Fields!Region.Value</Value><CanGrow>true</CanGrow></Textbox>
                  </CellContents>
                </TablixCell>
                <TablixCell>
                  <CellContents>
                    <Textbox Name="TotalSales"><Value>=Fields!Total_Sales.Value</Value></Textbox>
                  </CellContents>
                </TablixCell>
              </TablixCells>
            </TablixRow>
          </TablixRows>
        </TablixBody>
        <TablixRowHierarchy>
          <TablixMembers>
            <TablixMember>
              <Group Name="Details"/>
            </TablixMember>
          </TablixMembers>
        </TablixRowHierarchy>
        <TablixColumnHierarchy>
          <TablixMembers />
        </TablixColumnHierarchy>
        <DataSetName>SalesDS</DataSetName>
      </Tablix>
    </ReportItems>
  </Body>
  <DataSources>
    <DataSource Name="SSAS_DS">
      <ConnectionProperties>
        <DataProvider>OLEDB-MD</DataProvider>
        <ConnectString>Data Source=localhost\TABULAR;Initial Catalog=BI_Model</ConnectString>
      </ConnectionProperties>
    </DataSource>
  </DataSources>
  <DataSets>
    <DataSet Name="SalesDS">
      <Query>
        <DataSourceName>SSAS_DS</DataSourceName>
        <CommandText>SELECT NON EMPTY { [Measures].[Total Sales] } ON COLUMNS, NON EMPTY { [FactSales].[Region].[Region].ALLMEMBERS } ON ROWS FROM [Model]</CommandText>
      </Query>
      <Fields>
        <Field Name="Region"><DataField>[FactSales].[Region].[Region].[MEMBER_CAPTION]</DataField></Field>
        <Field Name="Total_Sales"><DataField>[Measures].[Total Sales]</DataField></Field>
      </Fields>
    </DataSet>
  </DataSets>
  <Width>6.5in</Width>
  <Page>
    <Width>8.5in</Width>
    <Height>11in</Height>
  </Page>
</Report>

A simple RDL with a Tablix bound to an MDX dataset: Total Sales by Region (a Tabular database exposes a cube named Model by default). Open the file in Report Builder or VS to preview. Pitfall: MDX syntax is strict; test the query in an SSMS MDX window against the SSAS instance first.
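As a sanity check on what the published report should show, the expected Total Sales by Region can be recomputed from the seed data. A quick illustrative Python sketch (values taken from 02_seed_data.sql, not queried live):

```python
from collections import defaultdict

# (qty, unit_price, region) from 02_seed_data.sql
sales = [(5, 1200.00, "Europe"), (20, 25.00, "North America"),
         (10, 150.00, "Europe"), (3, 1800.00, "Asia"),
         (8, 1200.00, "North America")]

by_region = defaultdict(float)
for qty, price, region in sales:
    by_region[region] += qty * price  # same logic as the Total Sales measure

for region in sorted(by_region):
    print(f"{region}: {by_region[region]:,.2f}")
# Asia: 5,400.00 / Europe: 7,500.00 / North America: 10,100.00
```

If the rendered tablix differs from these three figures, check the ETL load or the MDX query before blaming the RDL layout.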

SSRS Publishing via PowerShell

07_publish_ssrs.ps1
# Install once (elevated or -Scope CurrentUser), then import
Install-Module -Name ReportingServicesTools -Scope CurrentUser -Force
Import-Module ReportingServicesTools

$server = 'http://localhost/ReportServer'
$rdlPath = 'C:\path\to\06_SalesReport.rdl'

# Create the target folder and the shared SSAS data source first
New-RsFolder -ReportServerUri $server -Path '/' -Name 'BI' -ErrorAction SilentlyContinue
New-RsDataSource -ReportServerUri $server -RsFolder '/BI' -Name 'SSAS_DS' `
    -Extension 'OLEDB-MD' `
    -ConnectionString 'Data Source=localhost\TABULAR;Initial Catalog=BI_Model' `
    -CredentialRetrieval 'Integrated'

# Upload the RDL; -Path is the local file, -RsFolder the destination folder
Write-RsCatalogItem -ReportServerUri $server -Path $rdlPath -RsFolder '/BI' -Overwrite

Write-Output 'SSRS report published: browse it at http://localhost/Reports'

The script creates the /BI folder and the shared SSAS data source, then uploads the RDL. Browse the report from the web portal at http://localhost/Reports. Pitfall: the cmdlets talk to the ReportServer SOAP endpoint, while Reports is the user-facing portal; don't confuse the two URLs. Windows Authentication must be enabled on the report server.

Best Practices

  • Automate with CI/CD: Integrate TMSL/PowerShell into Azure DevOps/GitHub Actions for zero-downtime deployments.
  • Secure connections: Use Managed Service Accounts or Key Vault for SSIS/SSAS credentials.
  • Optimize performance: automated partition management in SSAS, CDC-driven incremental ETL in SSIS, SSRS report caching.
  • Monitoring: Enable Extended Events in SSISDB, SSAS DMVs, SSRS traces.
  • Scalability: Migrate to Azure Analysis Services for cloud bursting.
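The incremental-ETL point is worth making concrete. Its simplest form is a high-watermark filter, the pattern that CDC refines; an illustrative Python sketch with made-up rows and watermark:

```python
from datetime import date

# High-watermark pattern: only rows newer than the last successful load are extracted
rows = [{"SaleID": 1, "SaleDate": date(2024, 1, 15)},
        {"SaleID": 2, "SaleDate": date(2024, 2, 5)},
        {"SaleID": 3, "SaleDate": date(2024, 2, 10)}]

def incremental_extract(rows, watermark):
    """Return rows past the watermark, plus the new watermark to persist."""
    fresh = [r for r in rows if r["SaleDate"] > watermark]
    new_watermark = max((r["SaleDate"] for r in fresh), default=watermark)
    return fresh, new_watermark

fresh, wm = incremental_extract(rows, date(2024, 1, 31))
print(len(fresh), wm)  # 2 2024-02-10
```

In SSIS the watermark typically lives in a control table and feeds a parameterized source query; CDC replaces the date filter with change-table reads.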

Common Errors to Avoid

  • Non-unique SSIS GUIDs: Recreate ConnectionManagerIDs on every XML import.
  • Unrefreshed SSAS partitions: Always invoke refresh post-ETL.
  • MDX/RDL syntax: Validate queries in SSMS before embedding.
  • SSISDB permissions: grant folder- or project-level catalog permissions for execution instead of sysadmin in production.
