How to remove duplicate rows in an Excel 2019/365/2016 spreadsheet?

Here’s a typical data analysis question we get from time to time:

I have an Excel spreadsheet with over 100,000 rows of customer information that I need to clean up for duplicates. I estimate that scrubbing the list manually would take me a few days. A colleague told me that I will most probably need an Excel Macro to get that done. Our IT guy, who is familiar with Macro development, is on leave, so I need to solve this on my own. Is there a way to solve this problem without knowing how to program?

Yes, there is a way to solve this in Excel, and it won’t require you to be proficient in Visual Basic programming. Let’s go ahead and solve this real quick for you.

Delete duplicate Excel rows

Here’s how to get rid of the unneeded entries in your spreadsheet:

  1. First off, make a copy of your workbook, so that you have a backup in case anything goes wrong.
  2. Next, open your Excel spreadsheet.
  3. In the upper Ribbon, hit the Data tab.
  4. Now go ahead and select the data range you would like to clean up. Make sure to include the table headers, if present.
  5. In the Data Tools section, hit the Remove Duplicates button.
  6. Now, go ahead and check the relevant column(s) that determine whether an entry in your list is a duplicate. In our case, using our demo Sales Targets data set, we’ll use the first 3 columns to identify duplicated rows.


  7. Hit OK.
  8. A message will be displayed detailing the number of duplicate entries removed and the number of unique values kept in the table.

  9. Hit OK and inspect your modified data set for correctness.
  10. If you are satisfied with the result, go ahead and save your file. Otherwise, you can undo the change (Ctrl+Z).

Advanced Tips:

  1. A similar capability is available in Excel Power Query, which lets you transform significantly larger data sets.
  2. If you are conversant with the Python language, you can easily drop duplicate rows from a data set using the Pandas library.
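
For the Python route, here’s a minimal sketch of the same dedup logic using Pandas. The tiny sales-targets table below is made-up demo data (the column names are assumptions, not the actual demo data set from the steps above); `drop_duplicates` with `keep="first"` mirrors Excel’s behavior of keeping the first occurrence of each duplicated row.

```python
import pandas as pd

# Hypothetical sales-targets data; rows 0 and 2 are identical across all columns.
df = pd.DataFrame({
    "Region": ["East", "West", "East", "North"],
    "Salesperson": ["Ann", "Ben", "Ann", "Cara"],
    "Target": [1000, 1500, 1000, 1200],
})

# subset=... plays the role of ticking specific columns in the
# Remove Duplicates dialog; keep="first" keeps the first occurrence.
deduped = df.drop_duplicates(subset=["Region", "Salesperson", "Target"], keep="first")

print(f"Removed {len(df) - len(deduped)} duplicate row(s); "
      f"{len(deduped)} unique rows remain.")
```

In a real workflow you would typically load the spreadsheet with `pd.read_excel(...)` instead of building the DataFrame by hand, then write the cleaned result back out with `to_excel`.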

Enjoy your data analysis 😉