02-19-2014, 03:28 PM
It's 42
Join Date: Jun 2010
Location: Global
Posts: 18,083
Quote:
~:$ cd /home/user/directory-where-the-files-are/
/home/user/directory-where-the-files-are/:$ cat file1 file2 > bigfile.csv
/home/user/directory-where-the-files-are/:$ sort bigfile.csv | uniq > sortfile.csv
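uniq only drops duplicate lines that sit next to each other, which is why the file gets sorted first. If you'd rather do it in one shot, sort can de-duplicate on its own -- same files as above, -u is the only change:
Quote:
/home/user/directory-where-the-files-are/:$ sort -u bigfile.csv > sortfile.csv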
Use your Linux webserver over SSH or a Linux computer terminal for this.
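If the files sit on a remote box, connect first and then run the commands above -- username and hostname here are just placeholders:
Quote:
~:$ ssh user@your-server.example.com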
Google has some Excel solutions (for the point-and-click crowd)...
https://www.google.com/search?q=remo...la8&oe=utf-8
Quote:
wc -l bigfile.csv
47784 bigfile.csv
wc -l sortfile.csv
29466 sortfile.csv
My 'bigfile.csv' has 47784 lines. Sorting out 18318 duplicate lines took less than 2 seconds -- touch that, Excel -- eat my dust!
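For anyone who wants to sanity-check that number, the shell can do the math too -- count the raw lines, count the unique lines, subtract (same bigfile.csv as above; it should print 18318 here, i.e. 47784 - 29466):
Quote:
~:$ echo $(( $(wc -l < bigfile.csv) - $(sort -u bigfile.csv | wc -l) ))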