Old 12-30-2008, 07:46 PM   #1
Bama
Confirmed User
 
Join Date: Nov 2001
Location: Redmond, WA
Posts: 2,727
Server Question

I am soooo not a server guy but I need to get a list of any/all .php files on my server. There are lots of sub-directories and the pages on the server are not linked together.

Anyone have a PHP script that can do this? I don't want to have to exclude any files or folders by hand (which rules out off-the-shelf solutions), and the printout needs to write out the full URL for me, such as:

http://www.domain.com/foo/foo1/foo.php
http://www.domain.com/foo2/foo1/foo/foo2.php

and not just foo.php and foo2.php

There's probably a simple /find *.php |grep all -print dummydoesntknowunix.txt command I could use too - so that would work as well.

Help is appreciated!
Old 12-30-2008, 07:48 PM   #2
munki
Do Fun Shit.
 
Join Date: Dec 2004
Location: OC
Posts: 13,393
Check out Google's Python sitemap script... I've used it to scrape active pages a couple of times.
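(If that means Google's old sitemap_gen.py generator, a typical run looks roughly like the sketch below; the config file name is an assumption, and the directories or URLs to scan have to be listed in that config.)

# hedged sketch: running Google's sitemap_gen.py against a config file (names are placeholders)
python sitemap_gen.py --config=config.xml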
Old 12-30-2008, 07:51 PM   #3
skrinkladoo
Confirmed User
 
 
Join Date: Oct 2007
Location: South Florida - "Fuck Bitches and Fight Crime - It's all we do!"
Posts: 629
Maybe start simple via SSH:

locate .php

or use a wildcard for the search.
I'm sure there are prettier ways... my SSH is rusty; I've been spoiled with developers and admins for a long time.
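A slightly fuller sketch of that approach - quoting the pattern so the shell doesn't expand it, and grepping for the docroot (the /var/www/ path is an assumption):

# hedged: list indexed paths ending in .php under an assumed docroot
locate '*.php' | grep '^/var/www/'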
Old 12-30-2008, 08:47 PM   #4
netpimp
Registered User
 
 
Join Date: Jan 2005
Location: Phoenix, AZ
Posts: 66
find / -name "*.php" -print > /tmp/phpfilelist.txt

or

find / -name "*.php" -print | more


locate would work above too, provided your locate database update script has run, which is usually a middle-of-the-night type of cron job.
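If you don't want to walk the whole filesystem, a sketch limited to the web root (the /var/www path is an assumption; swap in your real docroot):

# hedged: regular files only, under the assumed docroot
find /var/www -type f -name '*.php' > /tmp/phpfilelist.txt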
Old 12-30-2008, 08:48 PM   #5
sortie
Confirmed User
 
Join Date: Mar 2007
Posts: 7,771
Quote:
Originally Posted by Bama View Post
I am soooo not a server guy but I need to get a list of any/all .php files on my server. There are lots of sub-directories and the pages on the server are not linked together.

Anyone have a PHP script that can do this? I don't want to have to exclude any files or folders by hand (which rules out off-the-shelf solutions), and the printout needs to write out the full URL for me, such as:

http://www.domain.com/foo/foo1/foo.php
http://www.domain.com/foo2/foo1/foo/foo2.php

and not just foo.php and foo2.php

There's probably a simple /find *.php |grep all -print dummydoesntknowunix.txt command I could use too - so that would work as well.

Help is appreciated!
Where are you hosted?
Old 12-30-2008, 09:48 PM   #6
PR|Jordan
Confirmed User
 
Join Date: Oct 2008
Posts: 244
OK, I am really not a tech - however, here is something I put together; remember to run updatedb first.

Some edits will have to be made for your case:

locate '*.php' | sed 's/\/var\/www\/html/www.domain.com/'

or do

locate '*.php' | sed 's/\/var\/www\/html/www.domain.com/' >> file.txt

I really wanted to use a recursive ls but couldn't for the life of me figure out a way to pull it from ls -RD lol
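For what it's worth, a recursive listing is possible without parsing ls -R output by using bash's globstar; a sketch, assuming bash 4+ and a /var/www/html docroot:

# hedged: recursive glob instead of ls -R (requires bash 4+)
shopt -s globstar
printf '%s\n' /var/www/html/**/*.php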
Old 12-31-2008, 09:05 AM   #7
PR|Jordan
Confirmed User
 
Join Date: Oct 2008
Posts: 244
Did that work for you?
Old 12-31-2008, 09:17 AM   #8
million
Confirmed User
 
Join Date: Apr 2006
Location: Pornyland
Posts: 789
Quote:
Originally Posted by netpimp View Post
find / -name "*.php" -print > /tmp/phpfilelist.txt

or

find / -name "*.php" -print | more


locate would work above too, provided your locate database update script has run, which is usually a middle-of-the-night type of cron job.
The binary is called "updatedb". It updates the slocate database, which the binary "locate" uses later.
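A hedged sketch of that two-step flow (updatedb usually needs root):

# refresh the slocate/mlocate database, then query it
sudo updatedb
locate '*.php'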
Old 12-31-2008, 01:22 PM   #9
PR|Jordan
Confirmed User
 
Join Date: Oct 2008
Posts: 244
Quote:
Originally Posted by netpimp View Post
find / -name "*.php" -print > /tmp/phpfilelist.txt

or

find / -name "*.php" -print | more


locate would work above too, provided your locate database update script has run, which is usually a middle-of-the-night type of cron job.
This would just give him the file locations, not the URL list like he was needing - you've got to use a sed in there.
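A sketch of what that combined pipeline might look like, producing full URLs in one pass (the docroot and domain are assumptions to edit for your setup):

# hedged: map filesystem paths under the docroot to URLs
find /var/www/html -type f -name '*.php' | sed 's|^/var/www/html|http://www.domain.com|' > file.txt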
Old 12-31-2008, 03:45 PM   #10
Bama
Confirmed User
 
Join Date: Nov 2001
Location: Redmond, WA
Posts: 2,727
Actually, tech support over at YellowFiber did it for me!

find / -name '*.php' > list.txt

This gave me the list as /user/home/domain.com/foo/foo.php, then I did a search/replace of /user/home/ with http://, and that gave me the list in the format I needed.
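The same search/replace can also be done in the shell; a sketch using the paths Bama mentions (illustration only):

# hedged: rewrite /user/home/... prefixes into http://... URLs
find / -name '*.php' | sed 's|^/user/home/|http://|' > list.txt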
Old 12-31-2008, 03:53 PM   #11
PR|Jordan
Confirmed User
 
Join Date: Oct 2008
Posts: 244
But it doesn't look as cool as:

locate '*.php' | sed 's/\/var\/www\/html/www.domain.com/' >> file.txt

But I am glad you got it working.
Old 12-31-2008, 04:17 PM   #12
Bama
Confirmed User
 
Join Date: Nov 2001
Location: Redmond, WA
Posts: 2,727
Thanks for the suggestions though - very much appreciated.