how to find the duplicate urls within thousands
I have a .txt file with thousands of URLs. My script has reported that there are 20 duplicate URLs, but it can't tell me which ones they are. Is there a script or a piece of software that can find and list the duplicates for me so I can fix them properly?
|
Use the search function in Notepad. Start at the top.
Or study Visual Basic and write a program. There are a couple of ideas. Hope someone has something a bit more direct. |
Links Suite 4
|
If you know PHP or some other scripting language, it's pretty easy.
One method: split the file into an array, loop through the array, and check whether you've already seen each URL. If you have, it's a duplicate; if not, remember it. Then print out the list of duplicates. There's probably a cleaner way, though; I'm lazy. |
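The loop described above works in any scripting language, as the poster says. Here's a minimal sketch in Python (the filename `urls.txt` is just a placeholder, one URL per line assumed):

```python
# Find URLs that appear more than once in a list of lines.
def find_duplicates(lines):
    seen = set()        # URLs we've encountered so far
    duplicates = []     # URLs seen at least twice, in first-repeat order
    for line in lines:
        url = line.strip()
        if not url:
            continue            # skip blank lines
        if url in seen:
            if url not in duplicates:
                duplicates.append(url)  # already seen: it's a duplicate
        else:
            seen.add(url)               # first sighting: remember it
    return duplicates

if __name__ == "__main__":
    # "urls.txt" is a placeholder; substitute your own file.
    with open("urls.txt") as f:
        for url in find_duplicates(f):
            print(url)
```

Each duplicate is printed once, so you get exactly the list of lines to go fix.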
Go to download.com and get one of the free notepad-style programs, do a search, and you'll find them really fast. |
From shell:
Code:
sort file.txt | uniq -d
|
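The `sort | uniq -d` one-liner prints each duplicated line exactly once. If you also want to know how many times each URL appears, the standard `uniq -c` option gives counts; a sketch, assuming one URL per line in `file.txt`:

```shell
# Count occurrences of each line, then keep only those appearing more than once.
sort file.txt | uniq -c | awk '$1 > 1 {print $1, $2}'
```

The output is "count url" for each duplicate, which makes it easy to see the worst offenders.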
Powered by vBulletin® Version 3.8.8
Copyright ©2000 - 2025, vBulletin Solutions, Inc.