#1
Confirmed User
Join Date: Nov 2001
Location: Redmond, WA
Posts: 2,727
Server Question
I am soooo not a server guy but I need to get a list of any/all .php files on my server. There are lots of sub-directories and the pages on the server are not linked together.
Anyone have a php script that can do this? I don't want to have to exclude any files or folders by hand (which rules out off-the-shelf solutions), and the print-out needs to write out the full URL for me, such as:

http://www.domain.com/foo/foo1/foo.php
http://www.domain.com/foo2/foo1/foo/foo2.php

and not just foo.php and foo2.php.

There's probably a simple "/find *.php |grep all -print dummydoesntknowunix.txt" type of command I could use too - so that would work as well.

Help is appreciated!
#2
Do Fun Shit.
Join Date: Dec 2004
Location: OC
Posts: 13,393
Check out Google's Python sitemap script... I've used it to scrape active pages a couple of times.
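If anyone goes that route, the usual invocation is roughly this (a sketch - config.xml is just a placeholder name; the directory-to-URL mapping and any .php filtering live in that XML config file):

python sitemap_gen.py --config=config.xml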
#3
Confirmed User
Join Date: Oct 2007
Location: South Florida - "Fuck Bitches and Fight Crime - It's all we do!"
Posts: 629
Maybe start simple via SSH:

locate .php

or use a wildcard for the search. I'm sure there are prettier ways... my SSH is rusty; I've been spoiled with developers and admins for a long time.
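Something like this, maybe - quoting the pattern so the shell doesn't expand it first, and phpfiles.txt being whatever output file you want:

locate "*.php" > phpfiles.txt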
#4
Registered User
Join Date: Jan 2005
Location: Phoenix, AZ
Posts: 66
find / -name "*.php" -print > /tmp/phpfilelist.txt
or

find / -name "*.php" -print | more

locate would work too, provided your locate database update script has run - that's usually a middle-of-the-night type of cron job.
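If the files all live under the web root, you could also narrow the search and rewrite the paths into URLs in one shot - a sketch, assuming the docroot is /var/www/html and the site is www.domain.com (the output filename is just an example):

find /var/www/html -type f -name "*.php" | sed 's|^/var/www/html|http://www.domain.com|' > /tmp/phpurls.txt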
#5
Confirmed User
Join Date: Mar 2007
Posts: 7,771
Quote:
#6
Confirmed User
Join Date: Oct 2008
Posts: 244
Ok, I am really not a tech - however, here is something I put together. Remember to run updatedb first, and some edits will have to be done for your case:

locate *.php | sed 's/\/var\/www\/html/www.domain.com/'

or, to save it to a file:

locate *.php | sed 's/\/var\/www\/html/www.domain.com/' >> file.txt

I really wanted to use a recursive ls but couldn't figure out a way to save my life to pull it from ls -RD, lol.
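If you want the http:// part in there too, anchoring the sed and quoting the pattern should do it (same /var/www/html docroot assumed):

locate "*.php" | sed 's|^/var/www/html|http://www.domain.com|' >> file.txt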
#7
Confirmed User
Join Date: Oct 2008
Posts: 244
did that work for you?
#8
Confirmed User
Join Date: Apr 2006
Location: Pornyland
Posts: 789
The binary is called "updatedb". It updates the slocate database, which the binary "locate" uses later.
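So if locate comes up empty for files you just uploaded, refreshing it by hand first should sort it (usually needs root):

updatedb          # rebuild the database instead of waiting for the nightly cron job
locate "*.php"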
#9
Confirmed User
Join Date: Oct 2008
Posts: 244
This would just give him the file locations, not the URL list he was needing - you've gotta use a sed in there.
#10
Confirmed User
Join Date: Nov 2001
Location: Redmond, WA
Posts: 2,727
Actually, tech support over at YellowFiber did it for me!
find / -name '*.php' > list.txt

This gave me the list as /user/home/domain.com/foo/foo.php, then I did a search/replace of /user/home/ with http:// and that gave me the list in the format I needed.
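(For anyone curious, the search/replace step could also be folded into the same line with sed - a sketch using that same /user/home/ prefix:)

find / -name '*.php' | sed 's|^/user/home/|http://|' > list.txt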
#11
Confirmed User
Join Date: Oct 2008
Posts: 244
But it doesn't look as cool as:

locate *.php | sed 's/\/var\/www\/html/www.domain.com/' >> file.txt

But I am glad you got it working.
#12
Confirmed User
Join Date: Nov 2001
Location: Redmond, WA
Posts: 2,727
Thanks for the suggestions tho - very much appreciated