Written by: ronen ariely
07/01/2020 18:13

This is a simple comparison I did for the sake of a question in the forums. Since I do not have time at the moment, I will add other solutions, dive deeper, and run further comparisons in the future. For now, I will present the results of two suggested solutions.

* Test environment: SQL Server 2019, running on a very weak laptop with 8 GB RAM.

Note! I do not rule out the possibility that further tests, different tests, or tests under different conditions will show different results. The conclusions I present here are based on the basic tests I have conducted so far!

Preparation - DDL+DML

DROP TABLE IF EXISTS T1;
GO
CREATE TABLE T1 (txt NVARCHAR(100));
GO
 
-- Fill the table with 10,000 rows of the form '<random text>-<number>-<random text>'
INSERT INTO T1 (txt)
SELECT TOP (10000)
    LEFT(REPLACE(CAST(NEWID() AS NVARCHAR(MAX)), '-', ''), ABS(CHECKSUM(NEWID())) % 10 + 1)
    + '-' + CONVERT(NVARCHAR(100), ABS(CHECKSUM(NEWID())) % 100000) + '-'
    + LEFT(REPLACE(CAST(NEWID() AS NVARCHAR(MAX)), '-', ''), ABS(CHECKSUM(NEWID())) % 10 + 1)
FROM sys.all_objects a1
CROSS APPLY sys.all_objects a2
GO
 
SELECT * FROM T1
GO
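As a quick sanity check (my addition, not part of the original test), the following confirms the row count and shows a few sample values; each value has the shape '<random hex>-<number>-<random hex>':

SELECT COUNT(*) AS NumberOfRows FROM T1;  -- expected: 10000
SELECT TOP (3) txt FROM T1;               -- e.g. 'A3F2B-48213-9CD' (hypothetical sample)
GO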

Option 1: Using PARSENAME (solution provided by me, Ronen Ariely)

SELECT PARSENAME(REPLACE(txt, '-', '.'), 2)
FROM T1
GO
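For readers unfamiliar with the trick, here is a minimal standalone illustration (my addition): PARSENAME treats a dot-separated string as a four-part object name and counts the parts from the right, so after replacing the dashes with dots, part 2 is the number in the middle.

-- 'X-15-Y' becomes 'X.15.Y'; part 2 (counted from the right) is '15'
SELECT PARSENAME(REPLACE('X-15-Y', '-', '.'), 2) AS SecondItem;
GO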


Option 2: Using JSON (solution based on Shnugo's code)

Note! In the original thread Shnugo provided (1) a solution for a single parameter and (2) a solution that finds all the parts of the text. In order to run the test and meet the requirements of the original question for the sake of the comparison, I simply used CROSS APPLY and removed the parsing of the other parts of the text.

Shnugo's Original solution

DECLARE @SerialNumber1 VARCHAR(10) = 'X-15-Y';
SELECT A.*
FROM OPENJSON(CONCAT('[["',REPLACE(@SerialNumber1,'-','","'),'"]]'))
WITH (FirstItem NVARCHAR(10) '$[0]'
     ,SecondItem INT '$[1]'
     ,ThirdItem NVARCHAR(10) '$[2]') A;
GO
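To see why this works, here is a small illustration (my addition) of the intermediate string that OPENJSON receives: the REPLACE/CONCAT turns the value into a JSON array of arrays, and the WITH clause maps each element by its position.

SELECT CONCAT('[["', REPLACE('X-15-Y', '-', '","'), '"]]') AS JsonText;
-- Result: [["X","15","Y"]]
GO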

-- The request was to get the number part only, not all parts.
-- Getting all parts changes the case, and therefore which solution is best.

The code I used for the comparison, based on the code above:

SELECT A.SecondItem
FROM T1
CROSS APPLY OPENJSON(CONCAT('[["',REPLACE(T1.txt,'-','","'),'"]]'))
WITH (SecondItem INT '$[1]') A;
GO
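As an optional sanity check (my addition, not part of the original comparison), the two approaches should return the same value for every row:

SELECT COUNT(*) AS Mismatches
FROM T1
CROSS APPLY OPENJSON(CONCAT('[["', REPLACE(T1.txt, '-', '","'), '"]]'))
     WITH (SecondItem INT '$[1]') A
WHERE PARSENAME(REPLACE(T1.txt, '-', '.'), 2) <> CAST(A.SecondItem AS NVARCHAR(100));
-- Expected: 0
GO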


Let's start by comparing the execution plans.

[Execution plan comparison screenshot]

Comparing IO

As we can expect from the execution plans, both solutions use the same IO.

In order to compare IO, we set STATISTICS IO to ON:

SET STATISTICS IO ON
GO
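With the statistics enabled, I ran both solutions again; for clarity, here are the two statements that produced the output below (repeated from the code above):

-- Option 1: PARSENAME
SELECT PARSENAME(REPLACE(txt, '-', '.'), 2) FROM T1;
GO
-- Option 2: OPENJSON
SELECT A.SecondItem
FROM T1
CROSS APPLY OPENJSON(CONCAT('[["', REPLACE(T1.txt, '-', '","'), '"]]'))
WITH (SecondItem INT '$[1]') A;
GO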

Result is:

Table 'T1'. Scan count 1, logical reads 61, physical reads 0, page server reads 0, read-ahead reads 0, page server read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob page server reads 0, lob read-ahead reads 0, lob page server read-ahead reads 0.
Completion time: 2020-01-07T18:19:03.8067881+02:00

Comparing Time

In order to compare time, we set STATISTICS TIME to ON:

SET STATISTICS TIME ON
GO

Since I used a small set of data, for both solutions I got zero parse and compile time:

SQL Server parse and compile time:
   CPU time = 0 ms, elapsed time = 0 ms.

But there was a huge difference in the SQL Server execution times!

For the first solution (using PARSENAME) I got CPU times between 15 and 62 ms, and elapsed times between 160 and 182 ms.

For the second solution (using JSON) I got CPU times between 62 and 125 ms, and elapsed times between 177 and 249 ms.

Note! I executed the solutions without clearing the cache (I cleared it only before the first execution).

Note! I executed these solutions over 100 times and recorded the min and max values.
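For anyone who wants to reproduce a similar measurement, here is a rough sketch of such a loop (my own illustration, not the exact harness I used): it runs one of the solutions repeatedly and records the minimum and maximum elapsed time.

DECLARE @i INT = 0, @t0 DATETIME2(7), @ms INT,
        @minMs INT = 2147483647, @maxMs INT = 0;
WHILE @i < 100
BEGIN
    SET @t0 = SYSDATETIME();

    -- the solution under test (Option 1 shown here)
    SELECT PARSENAME(REPLACE(txt, '-', '.'), 2) FROM T1;

    SET @ms = DATEDIFF(MILLISECOND, @t0, SYSDATETIME());
    IF @ms < @minMs SET @minMs = @ms;
    IF @ms > @maxMs SET @maxMs = @ms;
    SET @i += 1;
END;
SELECT @minMs AS MinElapsedMs, @maxMs AS MaxElapsedMs;
GO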

References