Removing duplicate rows (based on values from multiple columns) from SQL table


Problem Description

I have the following SQL table:

AR_Customer_ShipTo

+--------------+------------+-------------------+------------+
| ARDivisionNo | CustomerNo |   CustomerName    | ShipToCode |
+--------------+------------+-------------------+------------+
|           00 | 1234567    | Test Customer     |          1 |
|           00 | 1234567    | Test Customer     |          2 |
|           00 | 1234567    | Test Customer     |          3 |
|           00 | ARACODE    | ARACODE Customer  |          1 |
|           00 | ARACODE    | ARACODE Customer  |          2 |
|           01 | CBE1EX     | Normal Customer   |          1 |
|           02 | ZOCDOC     | Normal Customer-2 |          1 |
+--------------+------------+-------------------+------------+

(ARDivisionNo, CustomerNo, ShipToCode) form the primary key of this table.
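
For reference, here is a minimal sketch of what the table definition might look like; the column types are assumptions, since the question only shows the data and the primary key:

-- Sketch only: column types are assumed, not stated in the question
CREATE TABLE AR_Customer_ShipTo (
    ARDivisionNo varchar(2)  NOT NULL,
    CustomerNo   varchar(20) NOT NULL,
    CustomerName varchar(50) NOT NULL,
    ShipToCode   varchar(4)  NOT NULL,
    CONSTRAINT PK_AR_Customer_ShipTo
        PRIMARY KEY (ARDivisionNo, CustomerNo, ShipToCode)
);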

If you notice, the first 3 rows belong to the same customer (Test Customer), who has different ShipToCodes: 1, 2 and 3. The same is true of the second customer (ARACODE Customer). Normal Customer and Normal Customer-2 each have only 1 record, with a single ShipToCode.

Now, I would like to query this table and get a result with only 1 record per customer. So, for any customer with more than 1 record, I would like to keep the record with the highest value for ShipToCode.
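
For the sample data above, the desired result would therefore look like this:

+--------------+------------+-------------------+------------+
| ARDivisionNo | CustomerNo |   CustomerName    | ShipToCode |
+--------------+------------+-------------------+------------+
|           00 | 1234567    | Test Customer     |          3 |
|           00 | ARACODE    | ARACODE Customer  |          2 |
|           01 | CBE1EX     | Normal Customer   |          1 |
|           02 | ZOCDOC     | Normal Customer-2 |          1 |
+--------------+------------+-------------------+------------+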

I tried various things:

(1) I can easily get the list of customers with only one record in the table.
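
For example (a sketch only; it is the same grouping as Query-1 below, just with the opposite HAVING condition):

SELECT ARDivisionNo, CustomerNo
FROM AR_Customer_ShipTo
GROUP BY ARDivisionNo, CustomerNo
HAVING COUNT(*) = 1;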

(2) With the following query, I am able to get the list of all the customers who have more than one record in the table.

[Query-1]

SELECT ARDivisionNo, CustomerNo
FROM AR_Customer_ShipTo 
GROUP BY ARDivisionNo, CustomerNo
HAVING COUNT(*) > 1;

(3) Now, in order to select the proper ShipToCode for each record returned by the above query, I am not able to figure out how to iterate through all the records it returns.

If I do this:

[Query-2]

SELECT TOP 1 ARDivisionNo, CustomerNo, CustomerName, ShipToCode  
FROM AR_Customer_ShipTo 
WHERE ARDivisionNo = '00' and CustomerNo = '1234567'
ORDER BY ShipToCode DESC

Then I can get the appropriate record for (00-1234567-Test Customer). Hence, if I could use all the results from Query-1 in the above query (Query-2), I could get the desired single record for each customer with more than one record. This could then be combined with the results from point (1) to achieve the desired end result.
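
One way to fold Query-1 into Query-2 without iterating (a sketch only; the accepted answer below takes a different, ROW_NUMBER-based approach) is to join the table back to a grouped subquery that picks MAX(ShipToCode) per customer, which also covers the single-record customers from point (1):

SELECT s.ARDivisionNo, s.CustomerNo, s.CustomerName, s.ShipToCode
FROM AR_Customer_ShipTo s
JOIN (
    SELECT ARDivisionNo, CustomerNo, MAX(ShipToCode) AS MaxShipToCode
    FROM AR_Customer_ShipTo
    GROUP BY ARDivisionNo, CustomerNo
) m ON  m.ARDivisionNo  = s.ARDivisionNo
    AND m.CustomerNo    = s.CustomerNo
    AND m.MaxShipToCode = s.ShipToCode;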

Again, there may be an easier way than the approach I am following. Please let me know how I can do this.

[Note: I have to do this using SQL queries only. I cannot use stored procedures, as I will ultimately execute this using 'Scribe Insight', which only allows me to write queries.]

Recommended Answer

SQL FIDDLE example

1) Use a CTE to get the record with the max ShipToCode value for each customer, based on ARDivisionNo and CustomerNo:

WITH cte AS (
  SELECT *,
     -- Number each customer's rows, highest ShipToCode first ('t' is the sample table in the SQL Fiddle)
     ROW_NUMBER() OVER (PARTITION BY ARDivisionNo, CustomerNo ORDER BY ShipToCode DESC) AS [rn]
  FROM t
)
SELECT * FROM cte WHERE [rn] = 1;

2) To delete the duplicate records, use a DELETE query instead of the SELECT and change the WHERE clause to rn > 1. Sample SQL FIDDLE:

WITH cte AS (
  SELECT *,
     ROW_NUMBER() OVER (PARTITION BY ARDivisionNo, CustomerNo ORDER BY ShipToCode DESC) AS [rn]
  FROM t
)
-- Deleting through the CTE removes the duplicate rows from the underlying table
DELETE FROM cte WHERE [rn] > 1;

-- Verify: only one row per customer remains
SELECT * FROM t;
