Index should not rebuild when sensitivity classification changes #567

@zijchen

Description

  • SqlPackage or DacFx Version: 162.5.57
  • .NET Framework (Windows-only) or .NET Core: both
  • Environment (local platform and source/target platforms): all

Steps to Reproduce:

  1. Create a sqlproj with this script:
CREATE TABLE [dbo].[Table1]
(
  [Id] INT NOT NULL PRIMARY KEY,
  [Name] NVARCHAR(50) NOT NULL,
  [Price] DECIMAL(18, 2) NOT NULL,
  [Description] NVARCHAR(255) NULL
)
GO

CREATE NONCLUSTERED INDEX [IX_Table1_Name_Price] ON [dbo].[Table1] ([Name], [Price])
GO

ADD SENSITIVITY CLASSIFICATION TO [dbo].[Table1].[Price]
WITH (
    LABEL = 'Confidential',
    INFORMATION_TYPE = 'Financial',
    RANK = HIGH
);
GO
  2. Publish to a database.
  3. Change the sensitivity classification above (for example, RANK = HIGH to RANK = CRITICAL).
  4. Publish or generate a deployment script against the database again. The generated script drops the index, applies the sensitivity classification change, and then recreates the index.
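
For illustration, the incremental script produced in step 4 looks roughly like the following (a sketch only; the exact statements, ordering, and scaffolding come from DacFx and are assumed here, not copied from actual output):

```sql
-- Sketch of the generated deployment script (assumed shape, not verbatim DacFx output)
DROP INDEX [IX_Table1_Name_Price] ON [dbo].[Table1];
GO

ADD SENSITIVITY CLASSIFICATION TO [dbo].[Table1].[Price]
WITH (
    LABEL = 'Confidential',
    INFORMATION_TYPE = 'Financial',
    RANK = CRITICAL
);
GO

CREATE NONCLUSTERED INDEX [IX_Table1_Name_Price] ON [dbo].[Table1] ([Name], [Price]);
GO
```

The expected behavior is that only the ADD SENSITIVITY CLASSIFICATION statement is emitted: the classification is metadata and does not change the index definition, so the DROP/CREATE of the index is unnecessary.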

Workaround
The workaround at this time is to set IgnoreSensitivityClassifications to true during publish and make the sensitivity classification changes manually in the database.
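
The manual step can be sketched as follows, assuming the RANK = HIGH to RANK = CRITICAL change from the repro above (ADD SENSITIVITY CLASSIFICATION replaces an existing classification on the column, so no explicit DROP should be needed):

```sql
-- Run directly against the target database after publishing with
-- IgnoreSensitivityClassifications set to true
ADD SENSITIVITY CLASSIFICATION TO [dbo].[Table1].[Price]
WITH (
    LABEL = 'Confidential',
    INFORMATION_TYPE = 'Financial',
    RANK = CRITICAL
);
GO
```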

Did this occur in prior versions? If not, which version(s) did it work in?

(DacFx/SqlPackage/SSMS/Azure Data Studio)
