
GoFuckYourself.com - Adult Webmaster Forum (https://gfy.com/index.php)
-   Fucking Around & Business Discussion (https://gfy.com/forumdisplay.php?f=26)
-   -   Server Question (https://gfy.com/showthread.php?t=878657)

Bama 12-30-2008 07:46 PM

Server Question
 
I am soooo not a server guy, but I need to get a list of any/all .php files on my server. There are lots of sub-directories and the pages on the server are not linked together.

Anyone have a PHP script that can do this? I don't want to have to exclude any files or folders by hand (which rules out off-the-shelf solutions), and the print-out needs to write out the full URL for me, such as:

http://www.domain.com/foo/foo1/foo.php
http://www.domain.com/foo2/foo1/foo/foo2.php

and not just foo.php and foo2.php

There's probably a simple /find *.php |grep all -print dummydoesntknowunix.txt command I could use too - so that would work as well

Help is appreciated!
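For reference, a working version of that guess would look something like this (a minimal sketch - find recurses on its own, so no grep is needed; the output filename is kept from the post above):

# list every .php file under the current directory, recursively
find . -name '*.php' > dummydoesntknowunix.txt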

munki 12-30-2008 07:48 PM

Check out Google's Python sitemap script... I've used it to scrape active pages a couple of times

skrinkladoo 12-30-2008 07:51 PM

Maybe start simple via SSH:

locate .php

or use a wildcard for the search.
I'm sure there are prettier ways... my SSH is rusty, I've been spoiled with developers and admins for a long time.

netpimp 12-30-2008 08:47 PM

find / -name "*.php" -print > /tmp/phpfilelist.txt

or

find / -name "*.php" -print | more


locate would work above too, provided your locate database update script has run, which is usually a middle-of-the-night type of cron job.

sortie 12-30-2008 08:48 PM

Quote:

Originally Posted by Bama (Post 15263846)
I am soooo not a server guy, but I need to get a list of any/all .php files on my server. [...]

Where are you hosted?

PR|Jordan 12-30-2008 09:48 PM

OK, I am really not a tech - however, here is something I put together; remember to run updatedb first.

Some edits will have to be done for your case :)
locate '*.php' | sed 's|/var/www/html|www.domain.com|'

or do

locate '*.php' | sed 's|/var/www/html|www.domain.com|' >> file.txt
I really wanted to use a recursive ls, but for the life of me I couldn't figure out how to pull it out of ls -RD lol

PR|Jordan 12-31-2008 09:05 AM

Did that work for you?

million 12-31-2008 09:17 AM

Quote:

Originally Posted by netpimp (Post 15263993)
[...] locate would work above too, provided your locate database update script has run, which is usually a middle-of-the-night type of cron job.

The binary is called "updatedb". It updates the slocate database, which the binary "locate" uses later.
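If you don't want to wait for the cron job, you can refresh the database by hand first (a quick sketch; updatedb usually needs root):

# rebuild the locate database now, then search it
sudo updatedb
locate '*.php'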

PR|Jordan 12-31-2008 01:22 PM

Quote:

Originally Posted by netpimp (Post 15263993)
find / -name "*.php" -print > /tmp/phpfilelist.txt [...]

This would just give him the file locations, not the URL list he was needing - gotta use a sed in there.
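Something like this, as a sketch - /var/www/html and www.domain.com stand in for the actual web root and domain:

# list the files, then rewrite the web-root prefix into a URL
find /var/www/html -name '*.php' | sed 's|^/var/www/html|http://www.domain.com|' > /tmp/phpurls.txt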

Bama 12-31-2008 03:45 PM

Actually, tech support over at YellowFiber did it for me!

find / -name '*.php' > list.txt

This gave me the list as /user/home/domain.com/foo/foo.php, then I did a search/replace of /user/home/ with http:// and that gave me the list in the format I needed :)
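The same search/replace can be done in the pipeline itself (a sketch matching the path layout Bama describes):

# strip the /user/home/ prefix and prepend http:// in one pass
find / -name '*.php' | sed 's|^/user/home/|http://|' > list.txt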

PR|Jordan 12-31-2008 03:53 PM

But it doesn't look as cool as:

locate '*.php' | sed 's|/var/www/html|www.domain.com|' >> file.txt

But I am glad you got it working :)

Bama 12-31-2008 04:17 PM

Thanks for the suggestions tho - very much appreciated

