Channel: SQL Archives - SQL Authority with Pinal Dave

SQL SERVER – FIX – Error: 26023 – TCP Port is Already in Use


One of my clients approached me saying that after patching their SQL Server instances, database mirroring stopped working. As usual, I asked them to share the ERRORLOG file (SQL SERVER – Where is ERRORLOG? Various Ways to Find ERRORLOG Location). In this blog post, we will learn how to fix error 26023: TCP Port is Already in Use.

Here are the interesting messages in the ERRORLOG file:

2016-11-03 22:11:04.32 spid24s The Service Broker protocol transport is disabled or not configured.
2016-11-03 22:11:04.33 spid18s Database mirroring has been enabled on this instance of SQL Server.
2016-11-03 22:11:04.32 spid24s Error: 26023, Severity: 16, State: 1.
2016-11-03 22:11:04.32 spid24s Server TCP provider failed to listen on [ ‘any’ 5022]. Tcp port is already in use.
2016-11-03 22:11:04.50 spid24s Error: 9692, Severity: 16, State: 1.
2016-11-03 22:11:04.50 spid24s The Database Mirroring protocol transport cannot listen on port 5022 because it is in use by another process.

I asked whether they had multiple instances of SQL Server and whether another instance was also using port 5022. They told me they had other instances, but none of them was using mirroring. To be doubly sure, I suggested running the following from a command prompt:

netstat -aon | find /I "5022"

Here is the output.

[Image: netstat output showing port 5022 in use by PID 1452]

As we can see above, port 5022 is already in use by PID 1452 (last column).
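To identify which process owns that PID, tasklist can filter on it; this is a sketch using the PID from the output above:

```batch
:: Show the process (and any services it hosts) that owns PID 1452
tasklist /SVC /FI "PID eq 1452"
```

The /SVC column makes it easy to spot which SQL Server instance the process belongs to.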

SOLUTION/WORKAROUND

We found that PID 1452 belonged to another instance, which had an endpoint created on the same port even though it was not using database mirroring. I asked them either to remove the endpoint from the other instance or, if they wanted to use mirroring there, to create the endpoint on a different port.

Once we deleted the endpoint from the other instance of SQL Server and restarted the endpoints here, database mirroring came back to life.
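For reference, the endpoints and the ports they listen on can be checked with a query like the following, run on the other instance. This is a sketch; the endpoint name Mirroring and port 5023 are examples, so adjust them to what the first query returns:

```sql
-- List TCP endpoints and the ports they listen on
SELECT e.name, e.state_desc, te.port
FROM sys.tcp_endpoints AS te
JOIN sys.endpoints AS e ON e.endpoint_id = te.endpoint_id;

-- Option 1: drop the unused endpoint (name is an example)
DROP ENDPOINT Mirroring;

-- Option 2: keep the endpoint, but move it to a free port
ALTER ENDPOINT Mirroring AS TCP (LISTENER_PORT = 5023);
```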

Reference: Pinal Dave (http://blog.sqlauthority.com)

First appeared on SQL SERVER – FIX – Error: 26023 – TCP Port is Already in Use


SQL SERVER – Error: 26014 – Unable to Load User-Specified Certificate


After seeing so many SQL Server startup problems, I felt I knew almost all of the errors, but I was wrong, until someone contacted me with a new one. I have learned that the ERRORLOG is always a good place to start (SQL SERVER – Where is ERRORLOG? Various Ways to Find ERRORLOG Location). In this blog post we are going to learn how to fix the error “unable to load user-specified certificate”. Here is what I saw on my client’s machine.

2016-11-03 08:55:09.64 spid9s Server name is ‘SQLSAPPROD\BILLING’. This is an informational message only. No user action is required.
2016-11-03 08:55:09.64 spid9s The NETBIOS name of the local node that is running the server is ‘SQLSAPNODE1’. This is an informational message only. No user action is required.
2016-11-03 08:55:09.64 Server Error: 26014, Severity: 16, State: 1.
2016-11-03 08:55:09.64 Server Unable to load user-specified certificate [Cert Hash(sha1) “FD757A4A777966D5EEB2BD5445D151528E47A62E”]. The server will not accept a connection. You should verify that the certificate is correctly installed. See “Configuring Certificate for Use by SSL” in Books Online.
2016-11-03 08:55:09.64 Server Error: 17182, Severity: 16, State: 1.
2016-11-03 08:55:09.64 Server TDSSNIClient initialization failed with error 0x80092004, status code 0x80. Reason: Unable to initialize SSL support. Cannot find object or property.

The above ERRORLOG snippet contains the interesting message: “Unable to load user-specified certificate [Cert Hash(sha1) “FD757A4A777966D5EEB2BD5445D151528E47A62E”]. The server will not accept a connection. You should verify that the certificate is correctly installed. See “Configuring Certificate for Use by SSL” in Books Online.”

The certificate hash value is picked up from the “Certificate” registry value; once the value is read, the certificate store is checked for a matching certificate (type, subject, thumbprint, etc. are verified). The value lives under:

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL<Version>.<InstanceID>\MSSQLServer\SuperSocketNetLib

Below is the registry key on my client’s computer.

[Image: the SuperSocketNetLib registry key showing the Certificate value]

The value of <Version> depends on the SQL Server version:

MSSQL10 – SQL Server 2008
MSSQL10_50 – SQL Server 2008 R2
MSSQL11 – SQL Server 2012
MSSQL12 – SQL Server 2014
MSSQL13 – SQL Server 2016

When I asked my client, they said they were not using any certificate.

WORKAROUND

If you are not using a certificate, you can go ahead and clear the value in the registry. If you are using a certificate, make sure it is installed correctly and has the right thumbprint, subject, etc.
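As a sketch, you can list the machine certificates to compare their hashes against the thumbprint from the ERRORLOG, and remove the registry value if no certificate should be used. The instance key below is an example path; adjust it to your version and instance ID:

```batch
:: List certificates in the local machine Personal store; compare hashes with ERRORLOG
certutil -store MY

:: Remove the Certificate value if SQL Server should not use one (example instance path)
reg delete "HKLM\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL13.MSSQLSERVER\MSSQLServer\SuperSocketNetLib" /v Certificate /f
```

A restart of the SQL Server service is needed for the change to take effect.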

Reference: Pinal Dave (http://blog.sqlauthority.com)

First appeared on SQL SERVER – Error: 26014 – Unable to Load User-Specified Certificate

SQL SERVER – Clustered Instance Online Error – SQL Server Network Interfaces: Error Locating Server/Instance Specified [xFFFFFFFF]


While I was playing with a SQL Server cluster in my lab, I restarted the VMs and found that I was not able to bring SQL Server online. As always, I looked for an error message, but there was nothing interesting. Let us see in this blog post how to fix this clustered instance online error.

Here were the observations:

  1. SQL ERRORLOG is getting created.
  2. If I start SQL from the services it runs fine.
  3. If I try to bring the SQL Server resource online in the cluster, it stays in “Online Pending” and then goes to the “Failed” state.

To learn more about the failure in the cluster, I generated the cluster log using the steps in my own article.

INFO [API] s_ApiGetQuorumResource final status 0.
INFO [RES] Network Name: Agent: Sending request Netname/RecheckConfig to NN:5447358a-a102-4fc9-95f4-c040e8716859:Netbios
ERR [RES] SQL Server : [sqsrvres] ODBC Error: [08001] [Microsoft][SQL Server Native Client 11.0]SQL Server Network Interfaces: Error Locating Server/Instance Specified [xFFFFFFFF]. (268435455)
ERR [RES] SQL Server : [sqsrvres] ODBC Error: [HYT00] [Microsoft][SQL Server Native Client 11.0]Login timeout expired (0)
ERR [RES] SQL Server : [sqsrvres] ODBC Error: [08001] [Microsoft][SQL Server Native Client 11.0]A network-related or instance-specific error has occurred while establishing a connection to SQL Server. Server is not found or not accessible. Check if instance name is correct and if SQL Server is configured to allow remote connections. For more information see SQL Server Books Online. (268435455)
INFO [RES] SQL Server : [sqsrvres] Could not connect to SQL Server (rc -1)
INFO [RES] SQL Server : [sqsrvres] SQLDisconnect returns following information
ERR [RES] SQL Server : [sqsrvres] ODBC Error: [08003] [Microsoft][ODBC Driver Manager] Connection not open (0)
INFO [RES] Network Name: Agent: Sending request Netname/RecheckConfig to NN:52cf277d-234b-4a81-a9a7-0f078fca2a17:Netbios

As per the cluster log, the cluster is not able to connect to the SQL Server service.

WORKAROUND / SOLUTION

Here are the common causes of the above error:

  1. Incorrect client alias created in the configuration manager
  2. SQL Browser isn’t running when SQL is listening on a non-default port or a named instance.
  3. TCP port connection issue.

I already have a detailed checklist of the common causes:

SQL SERVER – FIX : ERROR : (provider: Named Pipes Provider, error: 40 – Could not open a connection to SQL Server) (Microsoft SQL Server, Error: )

In my lab, I found that I had a TCP alias created, and the port of SQL Server had changed after the reboot, causing the cluster issue.

[Image: the TCP alias defined in SQL Server Configuration Manager]

To fix this permanently, I changed SQL Server to listen on a static port instead of a dynamic port.
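The port an instance is currently listening on can be confirmed from the instance itself; the static vs. dynamic setting itself is changed in SQL Server Configuration Manager. A quick check:

```sql
-- Port and address used by the current connection (TCP connections only)
SELECT local_net_address, local_tcp_port
FROM sys.dm_exec_connections
WHERE session_id = @@SPID;
```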

Have you ever encountered a situation where the cluster log helped you?

Reference: Pinal Dave (http://blog.SQLAuthority.com)

First appeared on SQL SERVER – Clustered Instance Online Error – SQL Server Network Interfaces: Error Locating Server/Instance Specified [xFFFFFFFF]

SQL SERVER – Error After Cluster Patching – Error: 5184, Severity: 16, State: 2


During my last consulting engagement, my client pulled me in to look at an issue they were facing. They informed me that they had applied a service pack to one of their clustered environments, and since then SQL Server had not come online. I asked them to share the ERRORLOG from the SQL instance (SQL SERVER – Where is ERRORLOG? Various Ways to Find ERRORLOG Location). Let us learn how to fix this error after cluster patching.

2016-11-20 21:09:49.44 spid9s Starting execution of PREINSTMSDB100.SQL
2016-11-20 21:09:49.44 spid9s —————————————-
2016-11-20 21:10:01.67 spid9s Error: 5184, Severity: 16, State: 2.
2016-11-20 21:10:01.67 spid9s Cannot use file ‘D:\MSSQL10_50.MSSQLSERVER\MSSQL\DATA\temp_MS_AgentSigningCertificate_database.mdf’ for clustered server. Only formatted files on which the cluster resource of the server has a dependency can be used. Either the disk resource containing the file is not present in the cluster group or the cluster resource of the Sql Server does not have a dependency on it.
2016-11-20 21:10:01.67 spid9s Error: 1802, Severity: 16, State: 1.
2016-11-20 21:10:01.67 spid9s CREATE DATABASE failed. Some file names listed could not be created. Check related errors.
2016-11-20 21:10:01.67 spid9s Error: 912, Severity: 21, State: 2.
2016-11-20 21:10:01.67 spid9s Script level upgrade for database ‘master’ failed because upgrade step ‘sqlagent100_msdb_upgrade.sql’ encountered error 598, state 1, severity 25. This is a serious error condition which might interfere with regular operation and the database will be taken offline. If the error happened during upgrade of the ‘master’ database, it will prevent the entire SQL Server instance from starting. Examine the previous errorlog entries for errors, take the appropriate corrective actions and re-start the database so that the script upgrade steps run to completion.
2016-11-20 21:10:01.67 spid9s Error: 3417, Severity: 21, State: 3.
2016-11-20 21:10:01.67 spid9s Cannot recover the master database. SQL Server is unable to run. Restore master from a full backup, repair it, or rebuild it. For more information about how to rebuild the master database, see SQL Server Books Online.
2016-11-20 21:10:01.67 spid9s SQL Trace was stopped due to server shutdown. Trace ID = ‘1’. This is an informational message only; no user action is required.

The start of the problem is Error: 5184, Severity: 16, State: 2.

The error message makes it clear that the D drive does not have a dependency on the SQL Server resource. We checked Failover Cluster Manager and found the below.

[Image: Failover Cluster Manager showing only Cluster Disk 4 as a dependency]

As we can see, we had only Cluster Disk 4, which was the E drive. We added the missing disk by clicking on the highlighted area. Once we added the disk, we found the issue was still not solved and SQL Server was not coming online. We checked the ERRORLOG again and found a new problem.

2016-11-20 21:09:48.32 Logon Error: 18456, Severity: 14, State: 11.
2016-11-20 21:09:48.32 Logon Login failed for user ‘NT AUTHORITY\SYSTEM’. Reason: Token-based server access validation failed with an infrastructure error. Check for previous errors. [CLIENT: 100.168.11.171]

I asked them a series of questions, and they informed me that they had already attempted to rebuild the system databases, which was news to me. So the problem now was that this login no longer existed in SQL Server, because the system databases had been rebuilt. Here were the steps to fix this issue:

  1. Start SQL Server in single-user mode from a command prompt:
NET START MSSQLSERVER /m
  2. Add the ‘NT AUTHORITY\SYSTEM’ login.
  3. Stop SQL Server:
NET STOP MSSQLSERVER
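Re-adding the ‘NT AUTHORITY\SYSTEM’ login while connected to the single-user instance can be sketched as follows; granting sysadmin here is an assumption about what the cluster service account needs, so adjust the permissions to your environment:

```sql
-- Recreate the Windows login lost during the system database rebuild
CREATE LOGIN [NT AUTHORITY\SYSTEM] FROM WINDOWS;

-- Grant server-level rights (sp_addsrvrolemember also works on SQL Server 2008 R2)
EXEC sp_addsrvrolemember @loginame = N'NT AUTHORITY\SYSTEM', @rolename = N'sysadmin';
```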

After this, we could bring SQL Server online, and the issue was resolved. Have you seen a similar issue where a rebuild was done in a cluster and it didn’t work?

Reference: Pinal Dave (http://blog.sqlauthority.com)

First appeared on SQL SERVER – Error After Cluster Patching – Error: 5184, Severity: 16, State: 2

SQL SERVER – Unable to bring resource online – Error DoREPLSharedDataUpgrade : Failed to create working directory


A few days ago I wrote a blog about a cluster-related issue where the configuration was not the same on both nodes, which was causing problems: SQL SERVER – Unable to bring resource online. Error – Data source name not found and no default driver specified. Let us learn how to fix a related “unable to bring resource online” error.

After reading the above article, one of my clients contacted me and said that he was seeing the same behavior, but there was no issue with drivers. What else could be done? I asked for the cluster log again and found the below.

00000d64.00001098::2016/10/18-09:07:13.839 ERR [RES] SQL Server : [sqsrvres] Worker Thread (11FE840): Failed to retrieve the ftdata root registry value (hr = 2147942402, last error = 0). Full-text upgrade will be skipped.
00000d64.00001098::2016/10/18-09:07:13.839 WARN [RES] SQL Server : [sqsrvres] Worker Thread (11FE840): ReAclDirectory : Failed to apply security to H:\MSSQL12.MSSQLSERVER\MSSQL\Data (1008).
00000d64.00001098::2016/10/18-09:07:13.995 WARN [RES] SQL Server : [sqsrvres] Worker Thread (11FE840): DoREPLSharedDataUpgrade : Failed to create working directory.
00000d64.00001384::2016/10/18-09:07:13.995 ERR [RES] SQL Server : [sqsrvres] SQL Cluster shared data upgrade failed with error 0 (worker retval = 3). Please contact customer support
00000d64.00001384::2016/10/18-09:07:13.995 ERR [RES] SQL Server : [sqsrvres] Failed to prepare environment for online. See previous message for detail. Please contact customer support

It looked like there was some problem with the cluster shared data upgrade. My client informed me that this happened after a service pack was installed. So we searched the registry for any incorrect locations and found that under [HKLM\Software\Microsoft\Microsoft SQL Server\MSSQL12.MSSQLSERVER\Replication], the WorkingDirectory value pointed to an invalid drive.

SOLUTION / WORKAROUND

We went ahead and corrected the path in WorkingDirectory, then tried to bring the SQL Server resource online on the node; it came online without any issues.
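For reference, the value can be inspected from a command prompt before editing it; the instance ID MSSQL12.MSSQLSERVER below is taken from the client’s setup described above:

```batch
:: Inspect the replication working directory configured for the instance
reg query "HKLM\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL12.MSSQLSERVER\Replication" /v WorkingDirectory
```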

[Image: the corrected WorkingDirectory registry value]

I used Process Monitor to find the registry key that was needed. Have you ever used this tool?

Reference: Pinal Dave (http://blog.sqlauthority.com)

First appeared on SQL SERVER – Unable to bring resource online – Error DoREPLSharedDataUpgrade : Failed to create working directory

SQL SERVER – How to Download Microsoft OLE DB Provider for Oracle (MSDAORA) for 64 bit?


During a recent conversation about an Oracle linked server with one of my clients, I learned something new about the “Microsoft OLE DB Provider for Oracle”. This blog has the highlights of a lengthy conversation.

  1. MSDAORA is the short form of “Microsoft OLE DB Provider for Oracle”.
  2. MSDAORA is a driver provided by Microsoft to connect to an Oracle database server.
  3. There is no 64-bit version of MSDAORA provided by Microsoft; it is available only for 32-bit.
  4. MSDAORA is no longer supported by current versions of Oracle. It was last updated for Oracle 9 and is no longer being updated.
  5. For 64-bit, we need to download the Oracle client and provider from Oracle’s site and use them for connecting with SQL Server. Oracle has a 64-bit version that we can use.
  6. The Oracle-supplied provider is the one supported by Oracle.

Here is a snippet from a Microsoft Connect item:

If you are using Oracle data sources, you should migrate to the Oracle-supplied provider and driver. Microsoft OLEDB Provider for Oracle (msdaora.dll) and Microsoft ODBC driver for Oracle (msorcl32.dll) are built by using Oracle Call Interface (OCI) version 7. Oracle no longer supports applications that use OCI version 7 calls, and these technologies are deprecated.

Now you may ask, how do I get the 32-bit version? Well, it’s already available as part of the operating system. You can verify this by creating a UDL file, as suggested in the link, and opening it via a 32-bit command prompt. Note that there are two cmd.exe files on a 64-bit machine: one under C:\Windows\System32 (64-bit) and another under C:\Windows\SysWOW64 (32-bit).
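As a sketch, an empty .udl file can be created and opened from each prompt to compare the provider lists; the file path is an example:

```batch
:: Create an empty UDL file
type nul > C:\temp\test.udl

:: Open it from the 32-bit command prompt (on 64-bit Windows, SysWOW64 hosts the 32-bit cmd.exe)
C:\Windows\SysWOW64\cmd.exe /c start C:\temp\test.udl
```

Double-clicking a .udl file opens the Data Link Properties dialog, whose Provider tab lists the OLE DB providers visible to that bitness.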

I created a UDL file and opened it via both command prompts.

[Image: Data Link Properties opened from the 32-bit command prompt, listing MSDAORA]

[Image: Data Link Properties opened from the 64-bit command prompt, without MSDAORA]

As we can see above, the provider is listed under the 32-bit prompt but not the 64-bit one.

Hopefully this blog answers a common question whose answer I found very difficult to search for.

Reference: Pinal Dave (http://blog.SQLAuthority.com)

First appeared on SQL SERVER – How to Download Microsoft OLE DB Provider for Oracle (MSDAORA) for 64 bit?

SQL SERVER – Testing Database Performance with tSQLt and SQLQueryStress


I guess it is no secret that testing plays a critical part in the development of any software product. The more rigorous the testing, the better the final product will be. In this blog post, we will test database performance with tSQLt and SQLQueryStress.

[Image: tSQLt]

There is a common practice where application code is tested quite meticulously, while database testing is either skipped for lack of time or done with whatever effort is left over. To be quite honest, things are even worse in real life: the database is remembered only when there are actual problems with it. Eventually, database access may become a real bottleneck in the performance of your application.

To get rid of such problems, I suggest looking into various aspects of database testing, including load testing and performance testing of SQL Server in general by means of unit tests.

Let’s consider an abstract task. For instance, imagine we are developing an engine for online shops. Our customers may have different sales volumes and product types, but to keep things straightforward, we will make the database structure as simple as possible.

Database Structure

USE [master]
GO

IF DB_ID('db_sales') IS NOT NULL BEGIN
    ALTER DATABASE [db_sales] SET SINGLE_USER WITH ROLLBACK IMMEDIATE
    DROP DATABASE [db_sales]
END
GO

CREATE DATABASE [db_sales]
GO

USE [db_sales]
GO

CREATE TABLE dbo.Customers (
      [CustomerID] INT IDENTITY PRIMARY KEY
    , [FullName] NVARCHAR(150)
    , [Email] VARCHAR(50) NOT NULL
    , [Phone] VARCHAR(50)
)
GO

CREATE TABLE dbo.Products (
      [ProductID] INT IDENTITY PRIMARY KEY
    , [Name] NVARCHAR(150) NOT NULL
    , [Price] MONEY NOT NULL CHECK (Price > 0)
    , [Image] VARBINARY(MAX) NULL
    , [Description] NVARCHAR(MAX)
)
GO

CREATE TABLE dbo.Orders (
      [OrderID] INT IDENTITY PRIMARY KEY
    , [CustomerID] INT NOT NULL
    , [OrderDate] DATETIME NOT NULL DEFAULT GETDATE()
    , [CustomerNotes] NVARCHAR(MAX)
    , [IsProcessed] BIT NOT NULL DEFAULT 0
)
GO

ALTER TABLE dbo.Orders WITH NOCHECK
    ADD CONSTRAINT FK_Orders_CustomerID FOREIGN KEY (CustomerID)
    REFERENCES dbo.Customers (CustomerID)
GO

ALTER TABLE dbo.Orders CHECK CONSTRAINT FK_Orders_CustomerID
GO

CREATE TABLE dbo.OrderDetails
(
      [OrderID] INT NOT NULL
    , [ProductID] INT NOT NULL
    , [Quantity] INT NOT NULL CHECK (Quantity > 0)
    , PRIMARY KEY (OrderID, ProductID)
)
GO

ALTER TABLE dbo.OrderDetails WITH NOCHECK
    ADD CONSTRAINT FK_OrderDetails_OrderID FOREIGN KEY (OrderID)
    REFERENCES dbo.Orders (OrderID)
GO

ALTER TABLE dbo.OrderDetails CHECK CONSTRAINT FK_OrderDetails_OrderID
GO

ALTER TABLE dbo.OrderDetails WITH NOCHECK
    ADD CONSTRAINT FK_OrderDetails_ProductID FOREIGN KEY (ProductID)
    REFERENCES dbo.Products (ProductID)
GO

ALTER TABLE dbo.OrderDetails CHECK CONSTRAINT FK_OrderDetails_ProductID
GO

Let’s also assume that our client application will work with the database through pre-written stored procedures. They are all pretty simple. The first procedure inserts a new customer, or returns the ID of an existing one:

CREATE PROCEDURE dbo.GetCustomerID
(
      @FullName NVARCHAR(150)
    , @Email VARCHAR(50)
    , @Phone VARCHAR(50)
    , @CustomerID INT OUT
)
AS BEGIN

    SET NOCOUNT ON;

    SELECT @CustomerID = CustomerID
    FROM dbo.Customers
    WHERE Email = @Email

    IF @CustomerID IS NULL BEGIN

        INSERT INTO dbo.Customers (FullName, Email, Phone)
        VALUES (@FullName, @Email, @Phone)

        SET @CustomerID = SCOPE_IDENTITY()

    END

END

The second procedure is for placing a new order:

CREATE PROCEDURE dbo.CreateOrder
(
      @CustomerID INT
    , @CustomerNotes NVARCHAR(MAX)
    , @Products XML
)
AS BEGIN

    SET NOCOUNT ON;

    DECLARE @OrderID INT

    INSERT INTO dbo.Orders (CustomerID, CustomerNotes)
    VALUES (@CustomerID, @CustomerNotes)

    SET @OrderID = SCOPE_IDENTITY()

    INSERT INTO dbo.OrderDetails (OrderID, ProductID, Quantity)
    SELECT @OrderID
         , t.c.value('@ProductID', 'INT')
         , t.c.value('@Quantity', 'INT')
    FROM @Products.nodes('items/item') t(c)

END

Suppose we need to ensure minimal cost during query execution. There will hardly be any performance issues with an empty database, so we need at least some data to test the performance of our database. Let’s use the following script to generate test data for the Customers table:

DECLARE @obj INT = OBJECT_ID('dbo.Customers')
      , @sql NVARCHAR(MAX)
      , @cnt INT = 10

;WITH
    E1(N) AS (
        SELECT * FROM (
            VALUES
                (1),(1),(1),(1),(1),
                (1),(1),(1),(1),(1)
        ) t(N)
    ),
    E2(N) AS (SELECT 1 FROM E1 a, E1 b),
    E4(N) AS (SELECT 1 FROM E2 a, E2 b),
    E8(N) AS (SELECT 1 FROM E4 a, E4 b)
SELECT @sql = '
DELETE FROM ' + QUOTENAME(OBJECT_SCHEMA_NAME(@obj))
    + '.' + QUOTENAME(OBJECT_NAME(@obj)) + '

;WITH
    E1(N) AS (
        SELECT * FROM (
            VALUES
                (1),(1),(1),(1),(1),
                (1),(1),(1),(1),(1)
        ) t(N)
    ),
    E2(N) AS (SELECT 1 FROM E1 a, E1 b),
    E4(N) AS (SELECT 1 FROM E2 a, E2 b),
    E8(N) AS (SELECT 1 FROM E4 a, E4 b)
INSERT INTO ' + QUOTENAME(OBJECT_SCHEMA_NAME(@obj))
    + '.' + QUOTENAME(OBJECT_NAME(@obj)) + '(' +
    STUFF((
        SELECT ', ' + QUOTENAME(name)
        FROM sys.columns c
        WHERE c.[object_id] = @obj
            AND c.is_identity = 0
            AND c.is_computed = 0
            FOR XML PATH(''), TYPE).value('.', 'NVARCHAR(MAX)'), 1, 2, '')
+ ')
SELECT TOP(' + CAST(@cnt AS VARCHAR(10)) + ') ' +
STUFF((
        SELECT '
    , ' + QUOTENAME(name) + ' = ' +
        CASE
            WHEN TYPE_NAME(c.system_type_id) IN (
                        'varchar', 'char', 'nvarchar',
                        'nchar', 'ntext', 'text'
                )
                THEN (
                    STUFF((
                        SELECT TOP(
                                CASE WHEN max_length = -1
                                    THEN CAST(RAND() * 10000 AS INT)
                                    ELSE max_length
                                END
                            /
                                CASE WHEN TYPE_NAME(c.system_type_id) IN ('nvarchar', 'nchar', 'ntext')
                                    THEN 2
                                    ELSE 1
                                END
                        ) '+SUBSTRING(x, (ABS(CHECKSUM(NEWID())) % 80) + 1, 1)'
                        FROM E8
                        FOR XML PATH(''), TYPE).value('.', 'NVARCHAR(MAX)'), 1, 1, '')
                )
            WHEN TYPE_NAME(c.system_type_id) = 'tinyint'
                THEN '50 + CRYPT_GEN_RANDOM(10) % 50'
            WHEN TYPE_NAME(c.system_type_id) IN ('int', 'bigint', 'smallint')
                THEN 'CRYPT_GEN_RANDOM(10) % 25000'
            WHEN TYPE_NAME(c.system_type_id) = 'uniqueidentifier'
                THEN 'NEWID()'
            WHEN TYPE_NAME(c.system_type_id) IN ('decimal', 'float', 'money', 'smallmoney')
                THEN 'ABS(CAST(NEWID() AS BINARY(6)) % 1000) * RAND()'
            WHEN TYPE_NAME(c.system_type_id) IN ('datetime', 'smalldatetime', 'datetime2')
                THEN 'DATEADD(MINUTE, RAND(CHECKSUM(NEWID()))
                      *
                      (1 + DATEDIFF(MINUTE, ''20000101'', GETDATE())), ''20000101'')'
            WHEN TYPE_NAME(c.system_type_id) = 'bit'
                THEN 'ABS(CHECKSUM(NEWID())) % 2'
            WHEN TYPE_NAME(c.system_type_id) IN ('varbinary', 'image', 'binary')
                THEN 'CRYPT_GEN_RANDOM(5)'
            ELSE 'NULL'
        END
    FROM sys.columns c
    WHERE c.[object_id] = @obj
        AND c.is_identity = 0
        AND c.is_computed = 0
        FOR XML PATH(''), TYPE).value('.', 'NVARCHAR(MAX)'), 1, 8, '
      ')
 + '
FROM E8
CROSS APPLY (
    SELECT x = ''0123456789-ABCDEFGHIJKLMNOPQRSTUVWXYZ abcdefghijklmnopqrstuvwxyz''
) t'

EXEC sys.sp_executesql @sql

The script was created to generate random test data for arbitrary tables. With this approach we succeed in terms of versatility, but fail in terms of realism:

CustomerID FullName Email Phone
———– ———————————— —————- —————
1 uN9UiFZ9i0pALwQXIfC628Ecw35VX9L i6D0FNBuKo9I ZStNRH8t1As2S
2 Jdi6M0BqxhE-7NEvC1 a12 UTjK28OSpTHx 7DW2HEv0WtGN
3 0UjI9pIHoyeeCEGHHT6qa2 2hUpYxc vN mqLlO 7c R5 U3ha
4 RMH-8DKAmewi2WdrvvHLh w-FIa wrb uH
5 h76Zs-cAtdIpw0eewYoWcY2toIo g5pDTiTP1Tx qBzJw8Wqn
6 jGLexkEY28Qd-OmBoP8gn5OTc FESwE l CkgomDyhKXG
7 09X6HTDYzl6ydcdrYonCAn6qyumq9 EpCkxI01tMHcp eOh7IFh
8 LGdGeF5YuTcn2XkqXT-92 cxzqJ4Y cFZ8yfEkr
9 7 Ri5J30ZtyWBOiUaxf7MbEKqWSWEvym7 0C-A7 R74Yc KDRJXX hw
10 D DzeE1AxUHAX1Bv3eglY QsZdCzPN0 RU-0zVGmU

Of course, nobody stops us from writing a script that generates more realistic data for the same Customers table:

DECLARE @cnt INT = 10

DELETE FROM dbo.Customers

;WITH
    E1(N) AS (
        SELECT * FROM (
            VALUES
                (1),(1),(1),(1),(1),
                (1),(1),(1),(1),(1)
        ) t(N)
    ),
    E2(N) AS (SELECT 1 FROM E1 a, E1 b),
    E4(N) AS (SELECT 1 FROM E2 a, E2 b),
    E8(N) AS (SELECT 1 FROM E4 a, E4 b)
INSERT INTO dbo.Customers (FullName, Email, Phone)
SELECT TOP(@cnt)
      [FullName] = txt
    , [Email] = LOWER(txt) + LEFT(ABS(CHECKSUM(NEWID())), 3) + '@gmail.com'
    , [Phone] =
        '+38 (' + LEFT(ABS(CHECKSUM(NEWID())), 3) + ') ' +
            STUFF(STUFF(LEFT(ABS(CHECKSUM(NEWID())), 9)
                , 4, 1, '-')
                    , 7, 1, '-')
FROM E8
CROSS APPLY (
    SELECT TOP(CAST(RAND(N) * 10 AS INT)) txt
    FROM (
        VALUES
            (N'Boris_the_Blade'),
            (N'John'), (N'Steve'),
            (N'Mike'), (N'Phil'),
            (N'Sarah'), (N'Ann'),
            (N'Andrey'), (N'Liz'),
            (N'Stephanie')
    ) t(txt)
    ORDER BY NEWID()
) t

The data has become more realistic:


FullName Email Phone
————— ————————– ——————-
Boris_the_Blade boris_the_blade1@gmail.com +38 (146) 296-33-10
John john130@mail.com +38 (882) 688-98-59
Phil phil155@gmail.com +38 (125) 451-73-71
Mike mike188@gmail.com +38 (111) 169-59-14
Sarah sarah144@gmail.com +38 (723) 124-50-60
Andrey andrey100@gmail.com +38 (193) 160-91-48
Stephanie stephanie188@gmail.com +38 (590) 128-86-02
John john723@gmail.com +38 (194) 101-06-65
Phil phil695@gmail.com +38 (164) 180-57-37
Mike mike200@gmail.com +38 (110) 131-89-45

However, we should not forget that we have foreign keys between tables, and generating consistent data for the remaining tables is a much harder task. To avoid inventing a solution for this problem, I suggest using dbForge Data Generator for SQL Server, which can generate meaningful test data for database tables.

[Image: dbForge Data Generator for SQL Server]

SELECT TOP 10 *
FROM dbo.Customers
ORDER BY NEWID()


CustomerID FullName Email Phone
———– ————– ———————————– —————–
18319 Noe Pridgen Doyle@example.com (682) 219-7793
8797 Ligia Gaddy CrandallR9@nowhere.com (623) 144-6165
14712 Marry Almond Cloutier39@nowhere.com (601) 807-2247
8280 NULL Lawrence_Z_Mortensen85@nowhere.com (710) 442-3219
8012 Noah Tyler RickieHoman867@example.com (944) 032-0834
15355 Fonda Heard AlfonsoGarcia@example.com (416) 311-5605
10715 Colby Boyd Iola_Daily@example.com (718) 164-1227
14937 Carmen Benson Dennison471@nowhere.com (870) 106-6468
13059 Tracy Cornett DaniloBills@example.com (771) 946-5249
7092 Jon Conaway Joey.Redman844@example.com (623) 140-7543

The test data is ready. Now let’s proceed to testing the performance of our stored procedures.

We have the GetCustomerID procedure, which returns a client ID; if the ID does not exist, the procedure creates the corresponding record in the Customers table. Let’s execute it with “Include Actual Execution Plan” enabled:

DECLARE @CustomerID INT
EXEC dbo.GetCustomerID @FullName = N'Сергей'
                     , @Email = 'sergeys@mail.ru'
                     , @Phone = '7105445'
                     , @CustomerID = @CustomerID OUT
SELECT @CustomerID

The plan shows that a full scan of the clustered index takes place:

[Image: execution plan showing a Clustered Index Scan]

To do this, SQL Server has to make 200 logical reads from the table, which takes approximately 20 milliseconds:


Table ‘Customers’. Scan count 1, logical reads 200, physical reads 0, …
SQL Server Execution Times:
CPU time = 0 ms, elapsed time = 20 ms.
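The reads and timing figures shown above come from session-level statistics, which can be switched on before running the procedure:

```sql
-- Report I/O and time statistics for each statement in this session
SET STATISTICS IO ON;
SET STATISTICS TIME ON;
```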

And that is the execution of a single query. What if our stored procedure is executed very actively? Constant index scanning will reduce server performance.

Let’s stress test our stored procedure with an interesting open-source tool, SQLQueryStress (available on GitHub).

[Image: SQLQueryStress run of GetCustomerID]

We can see that 2,000 calls of the GetCustomerID procedure across two threads took a little less than 4 seconds. Now let’s see what happens if we add an index on the column used in the search:

CREATE NONCLUSTERED INDEX IX_Email ON dbo.Customers (Email)

The execution plan shows that Index Scan has been replaced with Index Seek:

[Image: execution plan showing an Index Seek]

The logical reads and total execution time have been reduced:


Table ‘Customers’. Scan count 1, logical reads 2, …
SQL Server Execution Times:
CPU time = 0 ms, elapsed time = 8 ms.

If we repeat our stress test in SQLQueryStress, we will see that our procedure loads the server less and executes faster when called repeatedly:

[Image: SQLQueryStress run after adding the index]

Now, let’s try to emulate mass order placement with SQLQueryStress:

DECLARE @CustomerID INT
      , @CustomerNotes NVARCHAR(MAX)
      , @Products XML

SELECT TOP(1) @CustomerID = CustomerID
            , @CustomerNotes = REPLICATE('a', RAND() * 100)
FROM dbo.Customers
ORDER BY NEWID()

SELECT @Products = (
    SELECT [@ProductID] = ProductID
         , [@Quantity] = CAST(RAND() * 10 AS INT)
    FROM dbo.Products
    ORDER BY ProductID
        OFFSET CAST(RAND() * 1000 AS INT) ROWS
        FETCH NEXT CAST(RAND() * 10 AS INT) + 1 ROWS ONLY
    FOR XML PATH('item'), ROOT('items')
)

EXEC dbo.CreateOrder @CustomerID = @CustomerID
                   , @CustomerNotes = @CustomerNotes
                   , @Products = @Products

[Image: SQLQueryStress run of CreateOrder]

100 executions of the procedure across two threads took 2.5 seconds. Let’s clear the wait statistics:

DBCC SQLPERF('sys.dm_os_wait_stats', CLEAR)

We can see that first place is taken by WRITELOG, whose wait time roughly corresponds to the total execution time of our stress test. What does this wait mean? Since each insert is an atomic transaction, the changes are physically hardened to the log when it commits. When there are lots of short transactions, a queue builds up, because log writes happen synchronously, unlike writes to data files.
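After clearing, the accumulated waits can be inspected directly from the wait-statistics DMV. Here is a simple sketch (the list of benign background waits to filter out can be extended as needed):

```sql
-- Top waits accumulated since the last DBCC SQLPERF clear
SELECT TOP (10)
       s.wait_type
     , wait_time_sec   = s.wait_time_ms / 1000.
     , waiting_tasks   = s.waiting_tasks_count
     , signal_wait_sec = s.signal_wait_time_ms / 1000.
FROM sys.dm_os_wait_stats s
WHERE s.wait_time_ms > 0
    AND s.wait_type NOT IN ( -- typical idle/background waits
          'SLEEP_TASK', 'LAZYWRITER_SLEEP'
        , 'XE_TIMER_EVENT', 'REQUEST_FOR_DEADLOCK_SEARCH')
ORDER BY s.wait_time_ms DESC
```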

SQL Server 2014 introduced Delayed Durability, the ability to defer log writes, which can be enabled at the database level:

ALTER DATABASE db_sales SET DELAYED_DURABILITY = ALLOWED
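With ALLOWED, each transaction opts in individually at commit time (as the procedure below does); FORCED would apply delayed durability to all transactions. The current setting can be checked in sys.databases:

```sql
-- DISABLED (default) | ALLOWED | FORCED
SELECT name, delayed_durability_desc
FROM sys.databases
WHERE name = N'db_sales'
```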

Next, we need to just alter the stored procedure:

ALTER PROCEDURE dbo.CreateOrder
(
      @CustomerID INT
    , @CustomerNotes NVARCHAR(MAX)
    , @Products XML
)
AS BEGIN

    SET NOCOUNT ON;

    BEGIN TRANSACTION t

        DECLARE @OrderID INT

        INSERT INTO dbo.Orders (CustomerID, CustomerNotes)
        VALUES (@CustomerID, @CustomerNotes)

        SET @OrderID = SCOPE_IDENTITY()

        INSERT INTO dbo.OrderDetails (OrderID, ProductID, Quantity)
        SELECT @OrderID
             , t.c.value('@ProductID', 'INT')
             , t.c.value('@Quantity', 'INT')
        FROM @Products.nodes('items/item') t(c)

    COMMIT TRANSACTION t WITH (DELAYED_DURABILITY = ON)

END

Let’s clear statistics and run stress test for the second time:

SQL SERVER - Testing Database Performance with tSQLt and SQLQueryStress tsqlt8

The total execution time has been halved, and the WRITELOG waits have become minimal:


wait_type                  wait_time
-------------------------  ---------
PREEMPTIVE_OS_WRITEFILE    0.027000
PAGEIOLATCH_EX             0.024000
PAGELATCH_EX               0.020000
WRITELOG                   0.014000

Let’s take a look at another situation, when a periodic check of the performance of one or another query is required. SQLQueryStress is not a convenient tool for this goal, since we have to open the app, paste the query and wait for its execution.

Can we automate this somehow?

In 2014, I tried out tSQLt for the first time, which turned out to be quite a nice free framework for unit testing. Let’s try to install tSQLt and create an autotest for checking the performance of our stored procedure.

Let’s download the latest version of tSQLt and set up the SQL Server instance for work with CLR:

EXEC sys.sp_configure 'clr enabled', 1
RECONFIGURE
GO

ALTER DATABASE [db_sales] SET TRUSTWORTHY ON
GO

After this, let’s execute the tSQLt.class.sql script from the downloaded archive against our database. The script creates its own tSQLt schema, a CLR assembly and a large number of objects. Procedures intended for internal framework use carry the Private_ prefix.
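To verify that the framework is ready to use, tSQLt exposes its version through a built-in function (assuming a tSQLt build recent enough to ship tSQLt.Info):

```sql
-- returns the installed framework version, e.g. 1.0.5873.27393
SELECT Version FROM tSQLt.Info()
```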

If everything has been installed successfully, we’ll see the following message in the Output:

+-----------------------------------------+
|                                         |
|    Thank you for using tSQLt.           |
|                                         |
|    tSQLt Version: 1.0.5873.27393        |
|                                         |
+-----------------------------------------+

Now, let’s create a schema in which we will create autotests:

USE [db_sales]
GO

CREATE SCHEMA [Performance]
GO

EXEC sys.sp_addextendedproperty @name = N'tSQLt.Performance'
                              , @value = 1
                              , @level0type = N'SCHEMA'
                              , @level0name = N'Performance'
GO

Note that this extended property is what tells tSQLt that the schema is a test class.
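Instead of creating the schema and the extended property by hand, the same result can be achieved with the framework’s own helper:

```sql
-- equivalent to the CREATE SCHEMA + sp_addextendedproperty calls above
EXEC tSQLt.NewTestClass 'Performance'
```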

Let’s create a test in the Performance schema and specify the test prefix in its name:

CREATE PROCEDURE [Performance].[test ProcTimeExecution]
AS BEGIN

    SET NOCOUNT ON;

    EXEC tSQLt.Fail 'TODO: Implement this test.'

END

Let’s try to execute the created autotest. For this, we can either execute

EXEC tSQLt.RunAll

…or specify the schema explicitly:

EXEC tSQLt.Run 'Performance'

…or run a certain test:

EXEC tSQLt.Run 'Performance.test ProcTimeExecution'

If we need to rerun the last executed test, we can call Run without parameters:

EXEC tSQLt.Run

After execution of one of the above commands, we will get the following information:

[Performance].[test ProcTimeExecution] failed: (Failure) TODO: Implement this test.

+----------------------+
|Test Execution Summary|
+----------------------+

|No|Test Case Name                        |Dur(ms)|Result |
+--+--------------------------------------+-------+-------+
|1 |[Performance].[test ProcTimeExecution]|      0|Failure|

Let’s modify the contents of the autotest into something useful. For instance, let’s take the GetUnprocessedOrders procedure that returns the list of unprocessed orders:

CREATE PROCEDURE dbo.GetUnprocessedOrders
AS BEGIN

    SET NOCOUNT ON;

    SELECT
          o.OrderID
        , o.OrderDate
        , c.FullName
        , c.Email
        , c.Phone
        , OrderSum = (
                SELECT SUM(p.Price * d.Quantity)
                FROM dbo.OrderDetails d
                JOIN dbo.Products p ON d.ProductID = p.ProductID
                WHERE d.OrderID = o.OrderID
            )
    FROM dbo.Orders o
    JOIN dbo.Customers c ON o.CustomerID = c.CustomerID
    WHERE o.IsProcessed = 0

END

…and create an autotest that will execute the procedure a certain number of times and fail with an error if the average execution time exceeds the specified threshold value.

ALTER PROCEDURE [Performance].[test ProcTimeExecution]
AS BEGIN

    SET NOCOUNT ON;
    DECLARE @time DATETIME
          , @duration BIGINT = 0
          , @cnt TINYINT = 10

    WHILE @cnt > 0 BEGIN

        SET @time = GETDATE()
        EXEC dbo.GetUnprocessedOrders
        SET @duration += DATEDIFF(MILLISECOND, @time, GETDATE())

        SET @cnt -= 1
    END

    IF @duration / 10 > 100 BEGIN
        
        DECLARE @txt NVARCHAR(MAX) = 'High average execution time: '
            + CAST(@duration / 10 AS NVARCHAR(10)) + ' ms'

        EXEC tSQLt.Fail @txt

    END

END

Let’s execute the autotest:

EXEC tSQLt.Run 'Performance'

We receive the following message:


[Performance].[test ProcTimeExecution] failed: (Error) High average execution time: 161 ms

+----------------------+
|Test Execution Summary|
+----------------------+

|No|Test Case Name                        |Dur(ms)|Result|
+--+--------------------------------------+-------+------+
|1 |[Performance].[test ProcTimeExecution]|   1620|Error |

Let’s try to optimize the query to make the test pass. First, let’s take a look at the execution plan:

SQL SERVER - Testing Database Performance with tSQLt and SQLQueryStress tsqlt9

As we can see, the problem is the frequent lookup into the clustered index of the Products table. A large number of logical reads also confirms this:


Table ‘Customers’. Scan count 1, logical reads 200, …
Table ‘Orders’. Scan count 1, logical reads 3886, …
Table ‘Products’. Scan count 0, logical reads 73607, …
Table ‘OrderDetails’. Scan count 1, logical reads 235, …

How can we fix the situation? We can add a nonclustered index with the Price field included, or precompute the values in a separate table. Alternatively, we can create an indexed view with aggregation:

CREATE VIEW dbo.vwOrderSum
WITH SCHEMABINDING
AS
    SELECT d.OrderID
         , OrderSum = SUM(p.Price * d.Quantity)
         , OrderCount = COUNT_BIG(*)
    FROM dbo.OrderDetails d
    JOIN dbo.Products p ON d.ProductID = p.ProductID
    GROUP BY d.OrderID
GO

CREATE UNIQUE CLUSTERED INDEX IX_OrderSum
    ON dbo.vwOrderSum (OrderID)

…and modify the procedure:

ALTER PROCEDURE dbo.GetUnprocessedOrders
AS BEGIN

    SET NOCOUNT ON;

    SELECT
          o.OrderID
        , o.OrderDate
        , c.FullName
        , c.Email
        , c.Phone
        , s.OrderSum
    FROM dbo.Orders o
    JOIN dbo.Customers c ON o.CustomerID = c.CustomerID
    JOIN dbo.vwOrderSum s WITH(NOEXPAND) ON o.OrderID = s.OrderID
    WHERE o.IsProcessed = 0

END

It’s better to specify the NOEXPAND hint to make the optimizer use the index on our view. Besides, we can create a new filtered index to minimize the number of logical reads from Orders:

CREATE NONCLUSTERED INDEX IX_UnProcessedOrders
    ON dbo.Orders (OrderID, CustomerID, OrderDate)
    WHERE IsProcessed = 0

Now, a simpler plan is used during execution of our stored procedure:

SQL SERVER - Testing Database Performance with tSQLt and SQLQueryStress tsqlt10

The number of logical reads has decreased as well:


Table ‘Customers’. Scan count 1, logical reads 200, …
Table ‘Orders’. Scan count 1, logical reads 21, …
Table ‘vwOrderSum’. Scan count 1, logical reads 44, …

The execution time of the stored procedure has dropped as well, and our test now passes:


|No|Test Case Name                        |Dur(ms)|Result |
+--+--------------------------------------+-------+-------+
|1 |[Performance].[test ProcTimeExecution]|    860|Success|

We can pat ourselves on the back: we have optimized the bottlenecks and built a really cool product. But let’s be honest with ourselves. Data tends to accumulate, and SQL Server generates execution plans based on the expected number of rows. We have tested with an eye to the future, but there is no guarantee that somebody won’t delete a required index, and so on. That’s why it is highly important to run such autotests on a regular basis, in order to catch problems in time.
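A simple way to run these autotests regularly is a SQL Server Agent job (or any scheduler) with a step that invokes the test run via sqlcmd; a sketch, assuming a local default instance and the database used in this article:

```
sqlcmd -S . -d db_sales -E -Q "EXEC tSQLt.RunAll" -b
```

The -b switch makes sqlcmd return a non-zero exit code when the batch raises an error, so a failing test marks the job step as failed.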

Now, let’s see what else we can do with unit tests.

For instance, we can check all execution plans for the MissingIndexGroup section. If this section is present, SQL Server considers that a certain query would benefit from an index that is missing:

CREATE PROCEDURE [Performance].[test MissingIndexes]
AS BEGIN

    SET NOCOUNT ON

    DECLARE @msg NVARCHAR(MAX)
          , @rn INT

    SELECT t.text
         , p.query_plan
         , q.total_worker_time / 100000.
    FROM (
        SELECT TOP 100 *
        FROM sys.dm_exec_query_stats
        ORDER BY total_worker_time DESC
    ) q
    CROSS APPLY sys.dm_exec_sql_text(q.sql_handle) t
    CROSS APPLY sys.dm_exec_query_plan(q.plan_handle) p
    WHERE p.query_plan.exist('//*:MissingIndexGroup') = 1

    SET @rn = @@ROWCOUNT
    IF @rn > 0 BEGIN

        SET @msg = 'Missing index in ' + CAST(@rn AS VARCHAR(10)) + ' queries'
        EXEC tSQLt.Fail @msg

    END

END

Also, we can automate the search for unused indexes. It’s quite simple – you just need to look at the usage statistics of each index in sys.dm_db_index_usage_stats:

CREATE PROCEDURE [Performance].[test UnusedIndexes]
AS BEGIN

    DECLARE @tables INT
          , @indexes INT
          , @msg NVARCHAR(MAX)

    SELECT @indexes = COUNT(*)
         , @tables = COUNT(DISTINCT o.[object_id])
    FROM sys.objects o
    CROSS APPLY (
        SELECT s.index_id
             , index_usage = s.user_scans + s.user_lookups + s.user_seeks
             , usage_percent =
                     (s.user_scans + s.user_lookups + s.user_seeks) * 100.
                 /
                     NULLIF(SUM(s.user_scans + s.user_lookups + s.user_seeks) OVER (), 0)
             , index_count = COUNT(*) OVER ()
        FROM sys.dm_db_index_usage_stats s
        WHERE s.database_id = DB_ID()
            AND s.[object_id] = o.[object_id]
    ) t
    WHERE o.is_ms_shipped = 0
        AND o.[schema_id] != SCHEMA_ID('tSQLt')
        AND o.[type] = 'U'
        AND (
                (t.usage_percent < 5 AND t.index_usage > 100 AND t.index_count > 1)
            OR
                t.index_usage = 0
        )

    IF @tables > 0 BEGIN

        SET @msg = 'Database contains ' + CAST(@indexes AS VARCHAR(10))
                 + ' unused indexes in ' + CAST(@tables AS VARCHAR(10)) + ' tables'
        EXEC tSQLt.Fail @msg

    END

END

When developing large and complicated systems, it is a frequent case that a table is created, filled with data and then forgotten for good.

So, how can such tables be identified? For example, no other objects reference them, and no selects from them have taken place since the server started, given that the server has been running for more than a week. These conditions are approximate and should be adapted to each specific case.

CREATE PROCEDURE [Performance].[test UnusedTables]
AS BEGIN

    SET NOCOUNT ON

    DECLARE @msg NVARCHAR(MAX)
          , @rn INT
          , @txt NVARCHAR(1000) = N'Starting up database ''' + DB_NAME() + '''.'

    DECLARE @database_start TABLE (
        log_date SMALLDATETIME,
        spid VARCHAR(50),
        msg NVARCHAR(4000)
    )

    INSERT INTO @database_start
    EXEC sys.xp_readerrorlog 0, 1, @txt

    SELECT o.[object_id]
         , [object_name] = SCHEMA_NAME(o.[schema_id]) + '.' + o.name
    FROM sys.objects o
    WHERE o.[type] = 'U'
        AND o.is_ms_shipped = 0
        AND o.[schema_id] != SCHEMA_ID('tSQLt')
        AND NOT EXISTS(
                SELECT *
                FROM sys.dm_db_index_usage_stats s
                WHERE s.database_id = DB_ID()
                    AND s.[object_id] = o.[object_id]
                    AND (
                           s.user_seeks > 0
                        OR s.user_scans > 0
                        OR s.user_lookups > 0
                        OR s.user_updates > 0
                    )
            )
        AND NOT EXISTS(
                SELECT *
                FROM sys.sql_expression_dependencies s
                WHERE o.[object_id] IN (s.referencing_id, s.referenced_id)
            )
        AND EXISTS(
                SELECT 1
                FROM @database_start t
                HAVING MAX(t.log_date) < DATEADD(DAY, -7, GETDATE())
            )

    SET @rn = @@ROWCOUNT

    IF @rn > 0 BEGIN

        SET @msg = 'Database contains ' + CAST(@rn AS VARCHAR(10)) + ' unused tables'
        EXEC tSQLt.Fail @msg

    END

END

Many more tests along these lines could be created.

To sum up, I can recommend trying out tSQLt and SQLQueryStress without hesitation: both products are completely free and have proved to be really useful for load testing of SQL Server and for performance optimization.

Speaking of the unit testing software, I would also like to mention dbForge Unit Test – a convenient add-in from Devart for automated unit testing in SQL Server Management Studio. The tool delivers high-quality unit testing and cuts down time spent on the process. Besides, Devart offers a free fully functional trial for the product for 30 days.

Reference: Pinal Dave (http://blog.sqlauthority.com)

First appeared on SQL SERVER – Testing Database Performance with tSQLt and SQLQueryStress

SQL SERVER – SqlServerWriter Missing from an Output of VSSadmin List Writers Command


A DBA from one of my clients contacted me for a quick suggestion while I was working with them on a performance tuning exercise. I always ask for some time to understand the exact problem and behavior. As per the DBA, they are using a 3rd party backup solution, and SQL backups are not happening. When they contacted the 3rd party vendor, they were asked to run the command VSSadmin List Writers and check whether SqlServerWriter is listed. Unfortunately, the component was missing from the output. He asked me about the possible reasons. Since this was not a performance tuning related question, I promised that I would share whatever information I could find and write a blog post about it.

When I run it on one of my machines, I get the output below.

SQL SERVER - SqlServerWriter Missing from an Output of VSSadmin List Writers Command vss-writer-01

There are more writers available which I am not showing in the image. The important point here is that the client was not seeing the entry below, which I could see on my lab machine.

SQL SERVER - SqlServerWriter Missing from an Output of VSSadmin List Writers Command vss-writer-02

CHECKLIST

Based on my search on the internet, I found below possible causes.

  1. Make sure that the SQL Writer Service is installed.
    SQL SERVER - SqlServerWriter Missing from an Output of VSSadmin List Writers Command vss-writer-03

If it’s not started, then start it. If it’s not available, then we need to install it from the SQL installation media.

  2. Check Event logs and look for errors.
  3. Make sure that no database on the instance has a trailing space in its name.
  4. Make sure the service account for SQLWriter is able to connect to ALL instances on the machine.
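Point 3 can be verified with a quick query; a sketch that relies on DATALENGTH counting the trailing spaces that RTRIM strips:

```sql
-- databases whose names end with whitespace
SELECT name
FROM sys.databases
WHERE DATALENGTH(name) <> DATALENGTH(RTRIM(name))
```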

WORKAROUND / SOLUTION

I was informed by my client that they had a permission related issue, and after changing the service account of SQL Server they could see the correct output; backups also started working.

Have you seen a similar issue and found any solution other than what I listed earlier?

Reference: Pinal Dave (http://blog.sqlauthority.com)

First appeared on SQL SERVER – SqlServerWriter Missing from an Output of VSSadmin List Writers Command


SQL SERVER – How to Automatically Generate SQL Server Documentation ?


Complete and precise documentation serves as a good basis for developers and DBAs during onboarding, giving them a general picture of the database they start working with. Creating such documentation manually takes a lot of time and effort. You sit for hours digging through the database structure, figuring out its functionality, the data types stored therein, building relationship diagrams, etc. Moreover, every time changes are introduced to the database, you have to go back and update the documentation by hand. This definitely feels like a burden, and you keep delaying it until the last minute.

However, this tiresome work can be avoided if you use a tool that automates the documentation process. One such tool is dbForge Documenter for SQL Server from Devart. It is an automated documentation utility that creates comprehensive and professional-looking documentation in just a few clicks.

This article covers the key features and advantages of Documenter and explains how to use the tool to automatically generate SQL database documentation.

Documenter is an easy-to-use tool and a big time saver when it comes to extracting an extensive data on database objects, properties, inter-object dependencies, and other info. The tool includes a rich set of features for customizing documentation output to meet your specific requirements.

Another great feature of Documenter is that it seamlessly integrates into SQL Server Management Studio (SSMS) so that you can document databases directly from the Object Explorer of your SSMS solution.

Now let’s move from words to deeds and see Documenter in action. The following example demonstrates how to generate documentation for the AdventureWorks2012 sample database.

  1. Start dbForge Documenter for SQL Server. Click New Documenter… on the Start Page.

SQL SERVER - How to Automatically Generate SQL Server Documentation ? generate1

  2. Select one or several existing connections from the list or create a new one.

SQL SERVER - How to Automatically Generate SQL Server Documentation ? generate2

  3. After you click the Select button, the Database Documenter project window will open with an object hierarchy structure pane on the left and a preview pane on the right.
  4. The preview pane initially shows the Cover Page of the documentation, where you can switch on breadcrumb navigation links, add a logo, specify a header, title, and descriptive text of the documentation, as well as specify the author and the date of creation.
  5. In the upper right, you can select a style defining the overall appearance of the documentation. Documenter provides a number of style templates and also allows you to use various Bootstrap themes to change the look of the documentation. This approach makes the documentation highly customizable in terms of style and presentation.

SQL SERVER - How to Automatically Generate SQL Server Documentation ? generate3

  6. In the search field of the Structure pane, start typing the name of the database you want to document. For example, type “Adv…”. As you type, Documenter filters out the databases and displays only the matching ones and highlights the relevant letters of the search text.

SQL SERVER - How to Automatically Generate SQL Server Documentation ? generate4

  7. Click the AdventureWorks2012 arrow in the structure pane to expand the list of database objects. Documenter retrieves metadata from the database and analyzes its structure.

SQL SERVER - How to Automatically Generate SQL Server Documentation ? generate5

  8. Once the data has been extracted, the Structure pane displays a tree-view structure of the database entities.

SQL SERVER - How to Automatically Generate SQL Server Documentation ? generate6

  9. Select the components you want to document. These can be the components at different levels, such as the entire database, or a specific table, or a column of a table.

SQL SERVER - How to Automatically Generate SQL Server Documentation ? generate7

  10. As you select the components, the preview pane shows details relating to the components under the following sections: Description, Object Types, Properties, Options, Database Files, and Sections To Include. These are the database-level sections.

SQL SERVER - How to Automatically Generate SQL Server Documentation ? generate8

Documenter allows you to configure elements of documentation on several levels, including:

– servers level;
– server level;
– databases level;
– database level;
– objects group level;
– database object level.

For example, in the structure pane, click the arrow next to Tables to expand the list of available tables and then select the Person.Address table.

SQL SERVER - How to Automatically Generate SQL Server Documentation ? generate9

The Documenter opens a preview of the table with the following sections: Description, Properties, Columns, Indexes, Foreign Keys, SQL Script, Depends On, Used By. You can exclude any of the listed sections and also specific properties (in the Properties section) for the Person.Address table so that they will not appear in the generated documentation.

  11. A nice and useful feature of Documenter is support for MS_Description, a descriptive text added to database objects by database designers. Documenter pulls this text from the MS_Description extended properties and automatically inserts it into the Description field of a database object. You can edit the descriptions directly in Documenter.
  12. When all the Documenter settings are configured, click Generate to proceed to the generation of documentation.
  13. In the Generate Documentation dialog box, choose a file format for the documentation. HTML format is suitable for databases to be uploaded to a web server. The HTML documentation requires no specific viewer tool (just a web browser) and can be instantly shared among a group of people. PDF is good for distributing to various systems and devices. Both formats are searchable, which is very convenient, especially for large databases.
  14. Next, specify the folder to store the generated documentation.

SQL SERVER - How to Automatically Generate SQL Server Documentation ? generate10

  15. Click the Generate button and watch the progress of the documentation generation.
  16. In a minute, your AdventureWorks2012 documentation is ready!

SQL SERVER - How to Automatically Generate SQL Server Documentation ? generate11

Documenter presents documentation in an easy-to-view format, so you can share it with your boss or clients, other developers, DBAs, testers, project managers, business executives, or other stakeholders.

Conclusion

As you can see, dbForge Documenter for SQL Server fully automates the documentation process and creates a professional-looking technical description of a SQL Server database.

Reference: Pinal Dave (http://blog.sqlauthority.com)

First appeared on SQL SERVER – How to Automatically Generate SQL Server Documentation ?

SQL SERVER – Creating Azure VM for SQL Server Using Portal


I have to admit that Microsoft has been investing heavily in the Azure platform, and adoption among customers has been great. In the recent past, though I have carried out several performance tuning engagements with customers on SQL Server, many have come back to me asking for consulting on migration to Azure. A lot of conversations have been around lift-and-shift application deployments on an Azure VM (Virtual Machine).

In this blog, let me take you through a step by step approach of initializing a VM on Azure using the portal.

  1. If you are using an Azure subscription, log on from Microsoft Azure.
  2. On the Azure Portal, select Virtual Machines located on the side navigation panel of the Microsoft Azure Management Portal page.
  3. Click the +Add button located at the top of the Virtual machines pane.
  4. In the Virtual Machine list, click SQL Server, and then SQL Server 2016 SP1 Developer on Windows Server 2016.
    SQL SERVER - Creating Azure VM for SQL Server Using Portal create-sql-vm-on-azure-01

Note: You can choose other images with a different edition / version of SQL Server, as required.

  5. Click the Create button in the right pane. Use the “Resource Manager” deployment model.
  6. On the Virtual Machine Configuration Basics page, complete the fields as follows:
    Virtual Machine Name: SQLAuthSQL01
    New User Name: Choose a secure local Administrator user account to provision.
    New Password and Confirm Password fields: Choose and confirm a new local Administrator password.
    Resource Group Name: SQLAuth
    Click OK.

SQL SERVER - Creating Azure VM for SQL Server Using Portal create-sql-vm-on-azure-02

Note: If the VM Name or Resource group name is already taken, choose another name.

  7. On the Virtual Machine Size page, select a configuration that suits you. You will see a different series of sizes depending on the type of disk selected. In the earlier configuration we selected SSD, hence this list will differ from the HDD configuration. Please pick based on your needs.
  8. On the Virtual Machine Configuration for optional features page, leave the default values as they are and click OK.
    PS: I would highly recommend you take a look at each of the configuration values and read about them. I will write a separate blog about some of the extensions later in detail.
  9. On the Virtual Machine Configuration SQL Server Settings page, leave the default values as they are and click OK.
    SQL SERVER - Creating Azure VM for SQL Server Using Portal create-sql-vm-on-azure-03
    Make sure to select SQL Authentication and use the other defaults.
  10. On the Virtual Machine Summary page, validate the configuration and click OK to start provisioning the VM.

This kick starts the deployment and you will see a progress on the dashboard as shown below.

SQL SERVER - Creating Azure VM for SQL Server Using Portal create-sql-vm-on-azure-04

This completes our steps to create a VM using the Azure portal. In subsequent blogs we will look at some of the settings that you can play around with.
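For repeatable deployments, the same VM can also be provisioned from the command line with the Azure CLI. This is only a rough sketch: the image URN below is illustrative, so list the current SQL Server image URNs with az vm image list --publisher MicrosoftSQLServer --all before relying on it.

```
az vm create \
  --resource-group SQLAuth \
  --name SQLAuthSQL01 \
  --image MicrosoftSQLServer:SQL2016SP1-WS2016:SQLDEV:latest \
  --admin-username sqladmin \
  --admin-password '<choose-a-strong-password>'
```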

Do let me know via comments if you are using Azure for your needs. Are you using VMs or SQL Azure Databases? Let us know how you use them.

Reference: Pinal Dave (http://blog.sqlauthority.com)

First appeared on SQL SERVER – Creating Azure VM for SQL Server Using Portal

SQL SERVER – Cannot initialize the data source object of OLE DB provider “Microsoft.ACE.OLEDB.12.0” for linked server


SQL SERVER - Cannot initialize the data source object of OLE DB provider "Microsoft.ACE.OLEDB.12.0" for linked server errorexit

As you might have seen, along with performance tuning work, I also help the community get answers to questions and troubleshoot issues. A few days ago, a community leader and user group champion contacted me for assistance. He informed me that they are using the Microsoft Access Database Engine 2010 provider to read data from an Excel file, but they were seeing the error below while doing a test connection. Let us learn about this error related to the OLE DB provider.

Cannot initialize the data source object of OLE DB provider "Microsoft.ACE.OLEDB.12.0" for linked server "EXCEL-DAILY".
OLE DB provider "Microsoft.ACE.OLEDB.12.0" for linked server "EXCEL-DAILY" returned message "Unspecified error". (Microsoft SQL Server, Error: 7303)

This error appears only from the client machine, but test connection works fine when done on the server. The same error message is visible via SSMS.

EXEC master.dbo.sp_addlinkedserver @server = N'EXCEL-DAILY'
	,@srvproduct = N'Excel'
	,@provider = N'Microsoft.ACE.OLEDB.12.0'
	,@datasrc = N'D:\EMPLOYEE_ATTEN_DATA.xlsx'
	,@provstr = N'Excel 12.0; HDR=Yes'
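Once the linked server exists, the worksheet can be queried with a four-part name; a sketch that assumes the workbook contains a sheet named Sheet1:

```sql
-- the sheet name is an assumption; adjust it to your workbook
SELECT *
FROM [EXCEL-DAILY]...[Sheet1$]
```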

After a lot of checking, we found that the accounts used from the client and on the server were different. So, I captured Process Monitor traces for both the working and non-working test connections. It didn’t take much time to spot the entry below in the non-working case.

sqlservr.exe QueryOpen C:\Users\svc_app\AppData\Local\Temp ACCESS DENIED

WORKAROUND/SOLUTION

So, we went to the SQL Server machine and gave full permission on the folder that Process Monitor listed as access denied: C:\Users\svc_app\AppData\Local\Temp.
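Granting that permission can also be scripted from an elevated command prompt; a sketch, where the account name is whichever account needs access in your environment:

```
icacls "C:\Users\svc_app\AppData\Local\Temp" /grant "DOMAIN\svc_app:(OI)(CI)F"
```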

After the permission was granted, the test connection worked and the client machine could read data from Excel via the linked server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

First appeared on SQL SERVER – Cannot initialize the data source object of OLE DB provider “Microsoft.ACE.OLEDB.12.0” for linked server

dbForge Studio for SQL Server – Ultimate SQL Server Manager Tool from Devart


dbForge Studio for SQL Server is a powerful IDE for managing, administering, configuring, and developing various components of SQL Server. The tool incorporates various graphical utilities and script editors that allow developers and administrators to access and manage SQL Server. You can download the latest version of dbForge Studio for SQL Server from the Devart website.

Feature Highlights

SQL Coding Assistance features help you get instant information on the SQL code you are typing: they prompt context-sensitive suggestions and object member lists, complete full JOIN clauses, expand INSERT, EXEC, and ALTER statements and wildcards, generate aliases, and much more. Automatic SQL formatting brings your code in line with common coding style standards. There is a great number of built-in SQL snippets that store repeated code fragments for reuse, saving you a good deal of time. Quick Object Info displays a complete description of any identifier in your code.

dbForge Studio for SQL Server - Ultimate SQL Server Manager Tool from Devart dbforge1

SQL Source Control allows you to manage database changes in source control when developing and deploying an application. You can quickly link an SQL database to one of the supported source control systems: Subversion (SVN), Team Foundation Server (including TFS Cloud), Git (including GitHub), Perforce (P4), Mercurial, SourceGear Vault. Commit local changes to a remote repository, update a local working copy with the latest changes from the source control, view change history, merge files, and resolve conflicts visually in a handy interface.

dbForge Studio for SQL Server - Ultimate SQL Server Manager Tool from Devart dbforge2

Index Manager lets you analyze the status of SQL indexes and fix the issues with index fragmentation. You can quickly gather index fragmentation statistics, detect databases that require maintenance, instantly rebuild and reorganize SQL indexes in a visual mode or generate an SQL script for future use.

dbForge Studio for SQL Server - Ultimate SQL Server Manager Tool from Devart dbforge3

Unit Test is a tSQLt-based tool that is designed for implementing automated unit testing so that you can develop stable and reliable SQL code that can be properly regression tested at the unit level.

dbForge Studio for SQL Server - Ultimate SQL Server Manager Tool from Devart dbforge4

Table Designer user interface is similar to that used in SSMS. It has visual editors for columns, indexes, primary and foreign keys, check constraints, statistics, and table storage properties.

dbForge Studio for SQL Server - Ultimate SQL Server Manager Tool from Devart dbforge5

Database Diagram helps you visualize a database. You can create one or more diagrams that illustrate some or all of the tables, columns, keys, and relationships of a specific database. dbForge Studio for SQL Server allows you to create as many database diagrams as you want.


Query Profiler provides a graphical representation of the actual and estimated query execution plans. A graphical plan is a commonly used type of execution plan, and the graphical format of both the actual and estimated plans makes them easier to analyze. Detailed data on the plans can be found in tooltips and property sheets. The tool also allows you to compare different plans while doing performance tuning.


Schema Compare tool allows you to synchronize database schemas with complex object dependencies. The results of schema comparison are neatly presented in a preview window. In addition, dbForge Studio for SQL Server generates a Data Definition Language (DDL) script that can be used to synchronize the different schemas.


In case you need to move data from one SQL Server instance to another or compare databases with different structures on two remote SQL Server instances, the Data Compare tool is what you need. With this tool, you can easily synchronize table data between servers, analyze data differences and create reports. You can even schedule regular data synchronization and automate the process.


When you develop an application, performing tests under conditions that closely simulate the production environment can be a formidable challenge. If you don’t have meaningful test data in your test environment, it will be hard to predict the application behavior after the release. The built-in Data Generator provides a great choice of predefined generators with sensible configuration options that allow generating column-intelligent and meaningful data.


If you are not too proficient with T-SQL, visual Query Builder will help you create complex queries using only a mouse. The tool provides visual editors for each query clause and automatically creates relationships between tables.


Event Profiler allows you to capture and analyze SQL Server events. The events and data columns will be stored in a physical trace file for later examination. You can use this information to identify and troubleshoot many SQL Server-related problems.


You can easily build SQL reports of any complexity, edit report parameters, customize formatting, and calculate summaries in the user-friendly interface of SQL Server Report Builder.


Conclusion

As you can see, dbForge Studio for SQL Server is a powerful SQL management tool that covers all the important areas of SQL database development, administration, and management. Another great advantage of dbForge Studio for SQL Server is that it is highly affordable: prices start from $249, which is awesome!

Reference: Pinal Dave (http://blog.sqlauthority.com)

First appeared on dbForge Studio for SQL Server – Ultimate SQL Server Manager Tool from Devart

SQL SERVER – Fix Error Msg 13603 working with JSON documents


Working with new data types is something we need to evolve with. On this blog, I have written a number of articles on JSON which you can take a look at. I wanted to write about some of the interesting errors one will get when using JSON custom paths. This is something I stumbled upon accidentally when using JSON constructs. Let us see how to fix error 13603 while working with JSON documents.

The defaults will not always work when you are dealing with databases like SQL Server. You will want to do your own customization, and there are multiple lessons to learn along the way. Here is a classic example. When working with FOR JSON PATH, I got the below error:

Msg 13603, Level 16, State 1, Line 1
Property ‘.SQLAuth’ cannot be generated in JSON output due to invalid character in the column name or alias. Column name or alias that contains ‘..’, starts or ends with ‘.’ is not allowed in query that has FOR JSON clause.

This error was generated by the following command shown in the image below:
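The screenshot of that command is not preserved here; as a sketch, a query of the following shape, using the same aliases that appear in the FOR XML PATH comparison later in this post, raises the error:

```sql
-- Alias starts with a dot ('.SQLAuth') and contains '..' ('SQLAuth..Pinal'),
-- both of which are illegal with FOR JSON PATH
SELECT 1 AS '.SQLAuth', 2 AS 'SQLAuth..Pinal'
FOR JSON PATH;
```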

This is when I realized the mistake I had made. An error is reported when illegal characters are found in a column name or column alias used with FOR JSON PATH. The name of one of the columns starts or ends with a dot.

Remedy: as a corrective action, I fixed the column aliases to conform to the FOR JSON PATH rules.
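For illustration (the aliases here are hypothetical, not from the original post), a conforming version uses dots only between name parts, which FOR JSON PATH interprets as nesting:

```sql
-- 'Author.Pinal' nests Pinal under Author; no leading/trailing dots, no '..'
SELECT 1 AS 'SQLAuth', 2 AS 'Author.Pinal'
FOR JSON PATH;
-- returns [{"SQLAuth":1,"Author":{"Pinal":2}}]
```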

I wanted to see what the error would be when making the same mistake with XML PATH. I wrote the same query as shown:

SELECT 1 as '.SQLAuth',2 as 'SQLAuth..Pinal'
FOR XML PATH

Msg 6850, Level 16, State 1, Line 1
Column name ‘.SQLAuth’ contains an invalid XML identifier as required by FOR XML; ‘.'(0x002E) is the first character at fault.

I can see it is similar but slightly different. I am not sure if Microsoft will change this to match the JSON error message, but it is good enough in my opinion.

Reference: Pinal Dave (http://blog.sqlauthority.com)

First appeared on SQL SERVER – Fix Error Msg 13603 working with JSON documents

SQL SERVER – Installation Error – [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified


One of my blog readers sent me an email asking for assistance. They were trying to patch SQL Server 2005 in a cluster, and the setup was failing to patch database services with the below error related to a data source name.

SQL Server Setup could not connect to the database service for server configuration. The error was: [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified Refer to server error logs and setup logs for more information.

[Screenshot: SQL Server setup error dialog showing the ODBC data source message]

If we search for the above error, we can see it can be caused by various reasons. If it is faced by a developer, it might be due to an incorrect connection string. In our case, it was raised by SQL Server setup, and no connection string is involved there.

After looking at various logs, a UDL test, etc., we found that the issue was with the SQL Server Native Client driver. There were multiple old versions of the driver installed on the machine.

WORKAROUND / SOLUTION

To fix this issue, we followed these steps:

  • Uninstall SQL Server Native Client component from the Programs and Features (Add/Remove Programs)
  • Install Microsoft SQL Server Native client for the version which we are installing. In the above case, it was SQL Server 2005.
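One way to see which ODBC/Native Client drivers are currently registered on the machine (a sketch, not part of the original troubleshooting; run from an elevated command prompt on the affected node) is to query the ODBC driver list in the registry:

```shell
rem List the ODBC drivers registered on this machine
reg query "HKLM\SOFTWARE\ODBC\ODBCINST.INI\ODBC Drivers"
```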

After following the above, we could patch the SQL cluster successfully. Have you seen a similar error earlier?

Reference: Pinal Dave (http://blog.sqlauthority.com)

First appeared on SQL SERVER – Installation Error – [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified

SQL SERVER – SSMS – Script Out Multiple Objects


Have you ever driven your car toward your office by mistake when it was supposed to be a day off? Have you ever done things unconsciously because you are trained for something routine? This is what I call human nature and how our brain works. Our brain is a complex organ, and it uses patterns and preset paths of execution when it is preoccupied. This is the path of least resistance, and it may not always be the best option either. In this blog post we will learn how to script out multiple objects in SQL Server.

If you are wondering why I am talking about this, let me give you an example with a tool that I use almost every single day – SQL Server Management Studio (SSMS). I use the Object Explorer and multiple other windows that are available. I am intrigued by how this tool works because there is always something to learn or unlearn when it comes to working with SSMS.

I was talking with a junior DBA and he wanted to script out multiple objects. My initial reaction was to guide him so he would learn, so I told him to do it using SSMS. After an hour he came back and told me it was not possible. It was important for me to show him how, so I asked him to do the steps on my PC.

[Screenshot: SSMS Object Explorer]

He opened Object Explorer and said he could right-click each object and script it out, or use a wizard, etc., but that was not an easy set of steps. He claimed to have worked with some third-party tool in the past that did help him. But I insisted there was something he was missing, and I did have an old trick that I wanted to show him.

In the same window I opened the “Object Explorer Details” pane (F7 shortcut). In this pane, I selected multiple objects (based on where the cursor is in Object Explorer) and showed how we can export the script.

[Screenshot: Object Explorer Details pane with multiple objects selected and the scripting options]

As shown above, a single script was generated in one step, and he was surprised this was hidden in SSMS. He had been trying to use the CTRL and SHIFT keys in the Object Explorer pane to select multiple objects, and when that failed, he assumed it was not possible. He said he is going to explore SSMS further, as there were areas of it he was not aware of.

Have you ever used this feature before? It has been in SSMS for years now. I find it handy and feel it is an underappreciated feature.

Reference: Pinal Dave (http://blog.sqlauthority.com)

First appeared on SQL SERVER – SSMS – Script Out Multiple Objects


SQL SERVER – Database Mirroring Login Attempt Failed with Error ‘Connection Handshake Failed’


One of my friends was trying to configure database mirroring and was having a hard time getting things working. The challenge was that he was using certificates for mirroring authentication because the machines were in a workgroup, not in a domain. He had followed one of the blogs on the Microsoft site but was still having problems.

[Screenshot: database mirroring configuration]

When he contacted me, he had been through multiple rounds of failure and had given up. I asked him to share the SQL Server ERRORLOG from all 3 servers.

SQL SERVER – Where is ERRORLOG? Various Ways to Find ERRORLOG Location

Here were the messages in ERRORLOG of Principal server.

2016-01-22 04:48:17.13 spid21s Error: 1474, Severity: 16, State: 1.
2016-01-22 04:48:17.13 spid21s Database mirroring connection error 4 ‘An error occurred while receiving data: ‘10054(An existing connection was forcibly closed by the remote host.)’.’ for ‘TCP://Witness:5022’.
2016-01-22 04:48:36.53 spid118 Error: 1456, Severity: 16, State: 3.
2016-01-22 04:48:36.53 spid118 The ALTER DATABASE command could not be sent to the remote server instance ‘TCP://Witness:5022’. The database mirroring configuration was not changed. Verify that the server is connected, and try again.

Here is the error message in the SQL Server ERRORLOG on witness server.

Database Mirroring login attempt failed with error: ‘Connection handshake failed. The login ‘login_mirroring’ does not have CONNECT permission on the endpoint. State 84.’. [CLIENT: xx.xx.xx.xx]

If we note the IP addresses, they are those of the principal and the mirror. The above message on the witness means that the login used by the principal and mirror servers, login_mirroring, did not have CONNECT permission on the witness endpoint.

SOLUTION / WORKAROUND

We need to grant CONNECT permission to the account that is failing to connect, as per the message in the ERRORLOG.

GRANT CONNECT ON ENDPOINT::Endpoint_Mirroring TO [login_mirroring]
GO
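To verify which logins can connect to an endpoint, you can query the server-level permission catalog views; this is a sketch using standard metadata, not part of the original troubleshooting:

```sql
-- List CONNECT (and other) permissions granted on endpoints
SELECT e.name AS endpoint_name,
       sp.name AS grantee,
       pe.permission_name,
       pe.state_desc
FROM sys.server_permissions AS pe
JOIN sys.endpoints AS e
  ON pe.major_id = e.endpoint_id
JOIN sys.server_principals AS sp
  ON pe.grantee_principal_id = sp.principal_id
WHERE pe.class_desc = 'ENDPOINT';
```

Run this on the witness after the GRANT to confirm login_mirroring now appears with CONNECT in the GRANT state.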

As soon as the above command was run on the witness server, there were no more mirroring login failures and mirroring started working fine.

Reference: Pinal Dave (http://blog.sqlauthority.com)

First appeared on SQL SERVER – Database Mirroring Login Attempt Failed with Error ‘Connection Handshake Failed’

SQL SERVER – Unable to Start SQL Service – Server TCP provider failed to listen on [‘any’ 1433]. Tcp port is already in use.


While playing with my multiple SQL instances, I realized that I was not able to start one of the SQL instances on my laptop. Here is the error which I received when I tried starting it from Start > Run > services.msc. This error is related to the TCP provider.

Windows could not start the SQL Server (MSSQLSERVER) on Local Computer. For more information, review the System Event Log. If this is a non-Microsoft service, contact the service vendor, and refer to service-specific error code 10048.

I opened ERRORLOG file and found below message.

2016-12-07 05:39:05.52 spid11s Server is listening on [ ‘any’ 51823].
2016-12-07 05:39:05.52 spid11s Error: 26023, Severity: 16, State: 1.
2016-12-07 05:39:05.52 spid11s Server TCP provider failed to listen on [ ‘any’ 51823]. Tcp port is already in use.

2016-12-07 05:39:05.52 spid11s Error: 17182, Severity: 16, State: 1.
2016-12-07 05:39:05.52 spid11s TDSSNIClient initialization failed with error 0x2740, status code 0xa. Reason: Unable to initialize the TCP/IP listener. Only one usage of each socket address (protocol/network address/port) is normally permitted.

2016-12-07 05:39:05.52 spid11s Error: 17120, Severity: 16, State: 1.
2016-12-07 05:39:05.52 spid11s SQL Server could not spawn FRunCommunicationsManager thread. Check the SQL Server error log and the Windows event logs for information about possible related problems.

Application event logs show below message.

Log Name: Application
Source: MSSQLSERVER
Date: 12/7/2016 5:38:18 AM
Event ID: 26023
Task Category: Server
Level: Error
Keywords: Classic
User: N/A
Computer: sqlserver2016
Description:
Server TCP provider failed to listen on [ ‘any’ 51823]. Tcp port is already in use.

SOLUTION/WORKAROUND

By looking at the error messages above, we know that SQL Server is trying to listen on port 51823 and some other process is already using that port. We have two choices at this point.

  1. Find the process which is using port 51823 and stop that process.
  2. Change the port of SQL Server to a port which is not used by any other process.

To find out details about port usage, I generally use a free tool called TCPView. This Sysinternals utility gives us the information we need to fix this issue. Just start it, pause the data view, and look at which process is using the local port with the TCP protocol. As we can see below, PID 3724 is using “Local Port” 51823, which is the port in the error message.

[Screenshot: TCPView showing PID 3724 listening on local port 51823]
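If you prefer the command line to TCPView, the same lookup can be done from an elevated command prompt with netstat and tasklist (the PID below is the one from this example; substitute your own):

```shell
rem Find the PID that owns port 51823 (last column of the netstat output)
netstat -aon | find /I "51823"

rem Map that PID back to a process name
tasklist /FI "PID eq 3724"
```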

Once we have identified the process that is already using that port, we can take the appropriate action.

If stopping or reconfiguring that process is not an option, you can always change the port on which your SQL Server instance is listening, but I’d personally be extremely curious as to which process is already using that port.

Configure a Server to Listen on a Specific TCP Port (SQL Server Configuration Manager)

Have you seen such behavior, and how did you fix it? Please comment and let me know, and share with other blog readers.

Reference: Pinal Dave (http://blog.sqlauthority.com)

First appeared on SQL SERVER – Unable to Start SQL Service – Server TCP provider failed to listen on [‘any’ 1433]. Tcp port is already in use.

SQL SERVER – How to get historical deadlock Information from System Health Extended Events?


Let me start off by asking a simple question: how many of you have seen the below deadlock error earlier?

Msg 1205, Level 13, State 45, Line 4
Transaction (Process ID 52) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.

I am sure most of us who work with SQL Server would say “Yes!”

Now, the next question is – what would you do if you see such an error? A few DBAs would say: we need to enable trace flag 1222 and wait for the next occurrence so that the information is recorded in the SQL Server ERRORLOG file.

Let me share a piece of advice. There is a session called system_health which is created in SQL Server by default and captures a lot of extended events.

[Screenshot: the system_health session under Extended Events in SSMS]

Here is the script of the session.

CREATE EVENT SESSION [system_health] ON SERVER 
ADD EVENT sqlclr.clr_allocation_failure(
    ACTION(package0.callstack,sqlserver.session_id)),
ADD EVENT sqlclr.clr_virtual_alloc_failure(
    ACTION(package0.callstack,sqlserver.session_id)),
ADD EVENT sqlos.memory_broker_ring_buffer_recorded,
ADD EVENT sqlos.memory_node_oom_ring_buffer_recorded(
    ACTION(package0.callstack,sqlserver.session_id,sqlserver.sql_text,sqlserver.tsql_stack)),
ADD EVENT sqlos.process_killed(
    ACTION(package0.callstack,sqlserver.client_app_name,sqlserver.client_hostname,sqlserver.client_pid,sqlserver.query_hash,sqlserver.session_id,sqlserver.session_nt_username)),
ADD EVENT sqlos.scheduler_monitor_deadlock_ring_buffer_recorded,
ADD EVENT sqlos.scheduler_monitor_non_yielding_iocp_ring_buffer_recorded,
ADD EVENT sqlos.scheduler_monitor_non_yielding_ring_buffer_recorded,
ADD EVENT sqlos.scheduler_monitor_non_yielding_rm_ring_buffer_recorded,
ADD EVENT sqlos.scheduler_monitor_stalled_dispatcher_ring_buffer_recorded,
ADD EVENT sqlos.scheduler_monitor_system_health_ring_buffer_recorded,
ADD EVENT sqlos.wait_info(
    ACTION(package0.callstack,sqlserver.session_id,sqlserver.sql_text)
    WHERE ([duration]>(15000) AND ([wait_type]>=N'LATCH_NL' AND ([wait_type]>=N'PAGELATCH_NL' AND [wait_type]<=N'PAGELATCH_DT' OR [wait_type]<=N'LATCH_DT' OR [wait_type]>=N'PAGEIOLATCH_NL' AND [wait_type]<=N'PAGEIOLATCH_DT' OR [wait_type]>=N'IO_COMPLETION' AND [wait_type]<=N'NETWORK_IO' OR [wait_type]=N'RESOURCE_SEMAPHORE' OR [wait_type]=N'SOS_WORKER' OR [wait_type]>=N'FCB_REPLICA_WRITE' AND [wait_type]<=N'WRITELOG' OR [wait_type]=N'CMEMTHREAD' OR [wait_type]=N'TRACEWRITE' OR [wait_type]=N'RESOURCE_SEMAPHORE_MUTEX') OR [duration]>(30000) AND [wait_type]<=N'LCK_M_RX_X'))),
ADD EVENT sqlos.wait_info_external(
    ACTION(package0.callstack,sqlserver.session_id,sqlserver.sql_text)
    WHERE ([duration]>(5000) AND ([wait_type]>=N'PREEMPTIVE_OS_GENERICOPS' AND [wait_type]<=N'PREEMPTIVE_OS_ENCRYPTMESSAGE' OR [wait_type]>=N'PREEMPTIVE_OS_INITIALIZESECURITYCONTEXT' AND [wait_type]<=N'PREEMPTIVE_OS_QUERYSECURITYCONTEXTTOKEN' OR [wait_type]>=N'PREEMPTIVE_OS_AUTHZGETINFORMATIONFROMCONTEXT' AND [wait_type]<=N'PREEMPTIVE_OS_REVERTTOSELF' OR [wait_type]>=N'PREEMPTIVE_OS_CRYPTACQUIRECONTEXT' AND [wait_type]<=N'PREEMPTIVE_OS_DEVICEOPS' OR [wait_type]>=N'PREEMPTIVE_OS_NETGROUPGETUSERS' AND [wait_type]<=N'PREEMPTIVE_OS_NETUSERMODALSGET' OR [wait_type]>=N'PREEMPTIVE_OS_NETVALIDATEPASSWORDPOLICYFREE' AND [wait_type]<=N'PREEMPTIVE_OS_DOMAINSERVICESOPS' OR [wait_type]=N'PREEMPTIVE_OS_VERIFYSIGNATURE' OR [duration]>(45000) AND ([wait_type]>=N'PREEMPTIVE_OS_SETNAMEDSECURITYINFO' AND [wait_type]<=N'PREEMPTIVE_CLUSAPI_CLUSTERRESOURCECONTROL' OR [wait_type]>=N'PREEMPTIVE_OS_RSFXDEVICEOPS' AND [wait_type]<=N'PREEMPTIVE_OS_DSGETDCNAME' OR [wait_type]>=N'PREEMPTIVE_OS_DTCOPS' AND [wait_type]<=N'PREEMPTIVE_DTC_ABORT' OR [wait_type]>=N'PREEMPTIVE_OS_CLOSEHANDLE' AND [wait_type]<=N'PREEMPTIVE_OS_FINDFILE' OR [wait_type]>=N'PREEMPTIVE_OS_GETCOMPRESSEDFILESIZE' AND [wait_type]<=N'PREEMPTIVE_ODBCOPS' OR [wait_type]>=N'PREEMPTIVE_OS_DISCONNECTNAMEDPIPE' AND [wait_type]<=N'PREEMPTIVE_CLOSEBACKUPMEDIA' OR [wait_type]=N'PREEMPTIVE_OS_AUTHENTICATIONOPS' OR [wait_type]=N'PREEMPTIVE_OS_FREECREDENTIALSHANDLE' OR [wait_type]=N'PREEMPTIVE_OS_AUTHORIZATIONOPS' OR [wait_type]=N'PREEMPTIVE_COM_COCREATEINSTANCE' OR [wait_type]=N'PREEMPTIVE_OS_NETVALIDATEPASSWORDPOLICY' OR [wait_type]=N'PREEMPTIVE_VSS_CREATESNAPSHOT')))),
ADD EVENT sqlserver.connectivity_ring_buffer_recorded(SET collect_call_stack=(1)),
ADD EVENT sqlserver.error_reported(
    ACTION(package0.callstack,sqlserver.database_id,sqlserver.session_id,sqlserver.sql_text,sqlserver.tsql_stack)
    WHERE ([severity]>=(20) OR ([error_number]=(17803) OR [error_number]=(701) OR [error_number]=(802) OR [error_number]=(8645) OR [error_number]=(8651) OR [error_number]=(8657) OR [error_number]=(8902) OR [error_number]=(41354) OR [error_number]=(41355) OR [error_number]=(41367) OR [error_number]=(41384) OR [error_number]=(41336) OR [error_number]=(41309) OR [error_number]=(41312) OR [error_number]=(41313)))),
ADD EVENT sqlserver.security_error_ring_buffer_recorded(SET collect_call_stack=(1)),
ADD EVENT sqlserver.sp_server_diagnostics_component_result(SET collect_data=(1)
    WHERE ([sqlserver].[is_system]=(1) AND [component]<>(4))),
ADD EVENT sqlserver.sql_exit_invoked(
    ACTION(package0.callstack,sqlserver.client_app_name,sqlserver.client_hostname,sqlserver.client_pid,sqlserver.query_hash,sqlserver.session_id,sqlserver.session_nt_username)),
ADD EVENT sqlserver.xml_deadlock_report
ADD TARGET package0.event_file(SET filename=N'system_health.xel',max_file_size=(5),max_rollover_files=(4)),
ADD TARGET package0.ring_buffer(SET max_events_limit=(5000),max_memory=(4096))
WITH (MAX_MEMORY=4096 KB,EVENT_RETENTION_MODE=ALLOW_SINGLE_EVENT_LOSS,MAX_DISPATCH_LATENCY=120 SECONDS,MAX_EVENT_SIZE=0 KB,MEMORY_PARTITION_MODE=NONE,TRACK_CAUSALITY=OFF,STARTUP_STATE=ON)
GO

There is no need to digest all of the information there. The piece which I want to highlight is below.

ADD EVENT sqlserver.xml_deadlock_report

The above means that, by default, SQL Server should be capturing the XML deadlock graph. To check this, I did a little experiment.

Create Database and Objects

First create a database and two tables.

CREATE DATABASE DeadlockDemo
GO
USE DeadlockDemo
GO
CREATE TABLE MyT1 (i INT)
GO
CREATE TABLE MyT2 (i INT)
GO
INSERT INTO MyT1
VALUES (1)
GO
INSERT INTO MyT2
VALUES (1)
GO

Reproduce Deadlock Error

We need two connections to reproduce a deadlock.

  1. Connection # 1: BEGIN TRAN
  2. Connection # 2: BEGIN TRAN
  3. Connection # 1: UPDATE MyT1 SET i = 3
  4. Connection # 2: UPDATE MyT2 SET i = 3
  5. Connection # 1: SELECT * FROM MyT2
  6. Connection # 2: SELECT * FROM MyT1

Once the above steps are complete, one of the connections will get the deadlock error “Transaction (Process ID 52) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.”

Look at the deadlock graph from System health session:

Here is the query which we can use to read the deadlock graph captured in the system_health session.

SELECT XEvent.query('(event/data/value/deadlock)[1]') AS DeadlockGraph
FROM (
	SELECT XEvent.query('.') AS XEvent
	FROM (
		SELECT CAST(target_data AS XML) AS TargetData
		FROM sys.dm_xe_session_targets st
		INNER JOIN sys.dm_xe_sessions s ON s.address = st.event_session_address
		WHERE s.NAME = 'system_health'
			AND st.target_name = 'ring_buffer'
		) AS Data
CROSS APPLY TargetData.nodes('RingBufferTarget/event[@name="xml_deadlock_report"]') AS XEventData(XEvent)
) AS source;

And we will get the same graph which we would get in the ERRORLOG via trace flag 1222.
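The ring buffer keeps only recent events; as a sketch (not from the original post), the same deadlock reports can also be pulled from the system_health event_file target, which retains more history:

```sql
-- Read xml_deadlock_report events from the system_health .xel files
SELECT CAST(event_data AS XML).query('(event/data/value/deadlock)[1]') AS DeadlockGraph
FROM sys.fn_xe_file_target_read_file('system_health*.xel', NULL, NULL, NULL)
WHERE object_name = 'xml_deadlock_report';
```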

Try the above experiment and share your experience via the comments.

Reference: Pinal Dave (http://blog.sqlauthority.com)

First appeared on SQL SERVER – How to get historical deadlock Information from System Health Extended Events?

SQL SERVER – Query to Get the List of Logins Having System Admin (sysadmin) Permission


Though the script sounded simple to me, I found that there are lots of incorrect scripts available on the internet. Here is one of the scripts I found to get the details of the system admins.

Please note that the following script is not accurate and I do not encourage you to depend on it. You will find the correct script at the end of this article, so please continue reading till the end of the blog post.

SELECT   name,type_desc,is_disabled
FROM     master.sys.server_principals 
WHERE    IS_SRVROLEMEMBER ('sysadmin',name) = 1
ORDER BY name

The above script looks very simple. When I ran it on my machine, I got the below output.

[Screenshot: output of the IS_SRVROLEMEMBER query]

I realized that some entries were missing. So, I went ahead and checked the properties of the sysadmin role and found the below.

[Screenshot: properties of the sysadmin role showing all of its members]

As we can see, not all 6 members appear in the output. So, here is the query I was able to write which gives accurate information.

SELECT sp.name AS [Name]
	,sp.is_disabled AS [Is_disabled]
FROM sys.server_role_members rm
	INNER JOIN sys.server_principals sp
		ON rm.member_principal_id = sp.principal_id
WHERE rm.role_principal_id = SUSER_ID('sysadmin')

Here is the output, which is accurate.

[Screenshot: output of the corrected query]
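As a quick cross-check (a sketch; not in the original post), the documented system procedure sp_helpsrvrolemember returns the same membership list:

```sql
-- Report all members of the sysadmin fixed server role
EXEC sp_helpsrvrolemember 'sysadmin';
```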

Do you have any similar interesting queries? Please share them with other readers via the comments section.

Reference: Pinal Dave (http://blog.sqlauthority.com)

First appeared on SQL SERVER – Query to Get the List of Logins Having System Admin (sysadmin) Permission

SQL SERVER – Why Cluster Network is Unavailable in Failover Cluster Manager?


It’s always a good experience to visit customer sites and talk to people. Sometimes I get to see things outside the SQL world as well. There is a lot to learn, and I believe in sharing what I learn. In this blog post we will discuss why a cluster network shows as unavailable in Failover Cluster Manager.

During my last visit to an India-based company, I was talking to a Windows admin during lunch and he mentioned a cluster issue. It was an interesting conversation in which he told me that sometimes a reboot is THE solution to a problem. He told me about an incident where cluster networks were shown as unavailable in Failover Cluster Manager. After lunch, I went to his desk to get more details.

[Screenshot: Failover Cluster Manager showing cluster networks as unavailable]

As we can see in the box drawn around the Nodes section above, this was happening with only one node.

When we look at the cluster logs, we see the below messages.


========B02===========
00000648.00002464::2016/11/29-08:58:45.173 INFO [FTI][Initiator] This node (1) is initiator
00000648.00002464::2016/11/29-08:58:45.173 WARN [FTI][Initiator] Ignoring duplicate connection: usable route already exists
00000648.00002464::2016/11/29-08:58:45.173 INFO [CHANNEL 147.170.123.251:~3343~] graceful close, status (of previous failure, may not indicate problem) ERROR_SUCCESS(0)
00000648.00002464::2016/11/29-08:58:45.174 WARN cxl::ConnectWorker::operator (): GracefulClose(1226)’ because of ‘channel to remote endpoint 147.170.123.251:~3343~ is closed’


========B01============
00004090.00005db0::2016/11/29-08:58:45.157 INFO [FTI][Follower] This node (2) is not the initiator
00004090.00005db0::2016/11/29-08:58:45.157 DBG [FTI] Stream already exists to node 1: false
00004090.00005db0::2016/11/29-08:58:45.157 DBG [CHANNEL 147.170.123.252:~54783~] Close().
00004090.00005db0::2016/11/29-08:58:45.157 INFO [CHANNEL 147.170.123.252:~54783~] graceful close, status (of previous failure, may not indicate problem) ERROR_SUCCESS(0)
00004090.00005db0::2016/11/29-08:58:45.157 INFO [CORE] Node 2: Clearing cookie 63cfe37d-42be-4211-8cd8-6db6b3344b52
00004090.00005db0::2016/11/29-08:58:45.157 DBG [CHANNEL 147.170.123.252:~54783~] Not closing handle because it is invalid.
00004090.00005db0::2016/11/29-08:58:45.157 WARN mscs::ListenerWorker::operator (): GracefulClose(1226)’ because of ‘channel to remote endpoint 147.170.123.252:~54783~ is closed’

Based on the cluster logs and the highlighted message “Ignoring duplicate connection: usable route already exists”, we can say that this issue is caused by stale network information on the rejecting node.

The only solution to fix the error was to reboot the active node.

I searched on the internet and found that this could also be caused by a real network issue or by some antivirus software. So, if the above message is not shown in the cluster log, you can search further. Please share the solution if you find one.
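If you need to generate the cluster log yourself to look for this message, the Get-ClusterLog PowerShell cmdlet from the FailoverClusters module writes a Cluster.log per node; a sketch (the destination path is illustrative):

```powershell
# Collect the last 60 minutes of cluster log from every node into C:\Temp
Get-ClusterLog -TimeSpan 60 -Destination C:\Temp
```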

Reference: Pinal Dave (http://blog.sqlauthority.com)

First appeared on SQL SERVER – Why Cluster Network is Unavailable in Failover Cluster Manager?
