Old 01-13-2025, 03:44 PM   #1
Publisher Bucks
Confirmed User
 
Industry Role:
Join Date: Oct 2018
Location: New Orleans, Louisiana. / Newcastle, England.
Posts: 1,123
Any idea why this cron job isn't doing its job?

It *was* working but stopped about a week ago; nothing has changed on the hosting account to my knowledge.

Quote:
<?php

$directories = [
    '/path/to/directory1',
    '/path/to/directory2',
    '/path/to/directory3'
    'etc...'
    'etc...'
];

$threshold = 30 * 24 * 60 * 60; // 30 days

foreach ($directories as $directory) {
    if (is_dir($directory)) {
        $dir = opendir($directory);

        while (($file = readdir($dir)) !== false) {
            if (pathinfo($file, PATHINFO_EXTENSION) === 'jpg') {
                $filePath = $directory . DIRECTORY_SEPARATOR . $file;

                if (file_exists($filePath) && time() - filemtime($filePath) > $threshold) {
                    unlink($filePath);
                    echo "Deleted: $filePath\n";
                }
            }
        }

        closedir($dir);
    } else {
        echo "Directory does not exist: $directory\n";
    }
}

echo "Cleanup complete.\n";
?>
__________________
SOMETHING EXTREME IS COMING SOON!
Old 01-13-2025, 07:07 PM   #2
sarettah
see you later, I'm gone
 
Industry Role:
Join Date: Oct 2002
Posts: 14,057
not enough info. could be permissions, could be the cron setup, etc.

what does it show if you run it from the browser?
__________________
All cookies cleared!
Old 01-13-2025, 07:12 PM   #3
Pipecrew
Master of Gfy.com
 
Pipecrew's Avatar
 
Industry Role:
Join Date: Feb 2002
Posts: 14,885
You're missing commas in the directories array. Try this:

<?php

$directories = [
    '/path/to/directory1',
    '/path/to/directory2',
    '/path/to/directory3',
    // Add more directories as needed
];

$threshold = 30 * 24 * 60 * 60; // 30 days in seconds

foreach ($directories as $directory) {
    if (is_dir($directory)) {
        $dir = opendir($directory);

        if ($dir) {
            while (($file = readdir($dir)) !== false) {
                // Check if the file is a JPG
                if (pathinfo($file, PATHINFO_EXTENSION) === 'jpg') {
                    $filePath = $directory . DIRECTORY_SEPARATOR . $file;

                    // Check if the file exists and is older than the threshold
                    if (file_exists($filePath) && time() - filemtime($filePath) > $threshold) {
                        if (unlink($filePath)) {
                            echo "Deleted: $filePath\n";
                        } else {
                            echo "Failed to delete: $filePath\n";
                        }
                    }
                }
            }

            closedir($dir);
        } else {
            echo "Failed to open directory: $directory\n";
        }
    } else {
        echo "Directory does not exist: $directory\n";
    }
}

echo "Cleanup complete.\n";
?>
Old 01-13-2025, 07:18 PM   #4
Publisher Bucks
Confirmed User
 
Industry Role:
Join Date: Oct 2018
Location: New Orleans, Louisiana. / Newcastle, England.
Posts: 1,123
Quote:
Originally Posted by sarettah View Post
not enough info. could be permissions, could be the cron setup, etc.

what does it show if you run it from the browser?
It says it's completed the cleanup, but images that are 45 days old are still present.

I have a ticket into the host to see if they've changed anything, no reply yet though :/
__________________
SOMETHING EXTREME IS COMING SOON!
Old 01-13-2025, 07:18 PM   #5
Publisher Bucks
Confirmed User
 
Industry Role:
Join Date: Oct 2018
Location: New Orleans, Louisiana. / Newcastle, England.
Posts: 1,123
Quote:
Originally Posted by Pipecrew View Post
You're missing commas in the directories array. Try this:
That was a typo on my part posting and editing the code lol
__________________
SOMETHING EXTREME IS COMING SOON!
Old 01-13-2025, 07:23 PM   #6
WiredGuy
Pounding Googlebot
 
Industry Role:
Join Date: Aug 2002
Location: Canada
Posts: 34,451
Is there any output from your crontab? Usually it mails the output on completion.
WG
__________________
I play with Google.
Old 01-13-2025, 07:56 PM   #7
Publisher Bucks
Confirmed User
 
Industry Role:
Join Date: Oct 2018
Location: New Orleans, Louisiana. / Newcastle, England.
Posts: 1,123
Well, I just heard back. Apparently the host disabled cron because it was using up 'excessive resources' running daily and deleting a few hundred images every day from multiple directories after they were created.

Currently emailing to get them to turn it back on LOL

Mystery solved, I do appreciate the assistance on this.

As a side note, is there a way I can make this quite simple code use fewer resources?
__________________
SOMETHING EXTREME IS COMING SOON!
Old 01-13-2025, 08:27 PM   #8
WiredGuy
Pounding Googlebot
 
Industry Role:
Join Date: Aug 2002
Location: Canada
Posts: 34,451
Quote:
Originally Posted by Publisher Bucks View Post
Well, I just heard back. Apparently the host disabled cron because it was using up 'excessive resources' running daily and deleting a few hundred images every day from multiple directories after they were created.

Currently emailing to get them to turn it back on LOL

Mystery solved, I do appreciate the assistance on this.

As a side note, is there a way I can make this quite simple code use fewer resources?
Run it in smaller batches. Keep track of a counter; after processing X images, halt the loop, and then run it again every hour instead of daily.
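Something like this rough sketch (the 500-file batch limit, the directory paths, and the jpg-only filter are placeholders to adjust):

<?php

// Rough sketch of the batched approach: stop after a fixed number of
// deletions per run, then let the next (hourly) cron run pick up the rest.
$directories = [
    '/path/to/directory1',
    '/path/to/directory2',
];

$threshold  = 30 * 24 * 60 * 60; // 30 days in seconds
$batchLimit = 500;               // max deletions per run (placeholder value)
$deleted    = 0;

foreach ($directories as $directory) {
    if (!is_dir($directory)) {
        continue;
    }

    // glob() keeps the extension filtering out of the loop body
    foreach (glob($directory . '/*.jpg') as $filePath) {
        if (time() - filemtime($filePath) > $threshold && unlink($filePath)) {
            $deleted++;
        }

        if ($deleted >= $batchLimit) {
            echo "Batch limit reached ($deleted deleted), stopping until the next run.\n";
            exit;
        }
    }
}

echo "Deleted $deleted files, cleanup complete.\n";
?>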
WG
__________________
I play with Google.
Old 01-13-2025, 11:00 PM   #9
Killswitch
👏 REVOLUTIONARY 👏
 
Killswitch's Avatar
 
Industry Role:
Join Date: Oct 2012
Posts: 2,304
bruh you just let your host go on your servers you pay for and disable things without your permission?

YIKES.
Old 01-14-2025, 01:02 PM   #10
mechanicvirus
Confirmed User
 
mechanicvirus's Avatar
 
Industry Role:
Join Date: Feb 2005
Location: Southern California
Posts: 3,735
Quote:
Originally Posted by Killswitch View Post
bruh you just let your host go on your servers you pay for and disable things without your permission?

YIKES.
Please don't kink shame
Old 01-14-2025, 01:36 PM   #11
Killswitch
👏 REVOLUTIONARY 👏
 
Killswitch's Avatar
 
Industry Role:
Join Date: Oct 2012
Posts: 2,304
Quote:
Originally Posted by mechanicvirus View Post
Please don't kink shame
my bad b
Old 01-14-2025, 02:31 PM   #12
mechanicvirus
Confirmed User
 
mechanicvirus's Avatar
 
Industry Role:
Join Date: Feb 2005
Location: Southern California
Posts: 3,735
Quote:
Originally Posted by Killswitch View Post
my bad b
Now that I think about it, you are 100% right... lol killing a cronjob because 15 people clicked an image and it "overloaded" a server.

What a world we live in.
Old 01-14-2025, 03:08 PM   #13
Killswitch
👏 REVOLUTIONARY 👏
 
Killswitch's Avatar
 
Industry Role:
Join Date: Oct 2012
Posts: 2,304
Quote:
Originally Posted by mechanicvirus View Post
Now that I think about it, you are 100% right... lol killing a cronjob because 15 people clicked an image and it "overloaded" a server.

What a world we live in.
$65/mo managed vps with half a shared cpu and 256mb of ram. Those 15 clicks were too much resource usage.
Old 01-14-2025, 05:33 PM   #14
cerulean
Web & App Development
 
cerulean's Avatar
 
Industry Role:
Join Date: Oct 2023
Location: United States
Posts: 120
If I were you, I would avoid doing this in PHP (and I'm primarily a PHP developer.) PHP is certainly capable of this, but loading a programming language runtime just to run existing system commands does complicate the process.

If you're on Linux, you can use the find command and your host should be well-versed in how to do this anyway:

https://stackoverflow.com/a/69374901

If you're on Windows, similar things exist:

https://stackoverflow.com/a/51069

Either way, you should make backups of these files before deleting them, capture the output of whatever you run, and send a notification on success. That way you know when it runs and whether it succeeded.
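If it does end up staying in PHP, a rough sketch of that backup-and-log idea could look like this (the backup directory, log path, and directory list are all placeholders):

<?php

// Rough sketch: move old .jpg files into a dated backup directory instead of
// deleting them outright, and append every action to a log file.
$directories = ['/path/to/directory1', '/path/to/directory2'];
$backupRoot  = '/path/to/backup/' . date('Y-m-d'); // placeholder path
$logFile     = '/path/to/cleanup.log';             // placeholder path
$threshold   = 30 * 24 * 60 * 60;                  // 30 days in seconds

if (!is_dir($backupRoot)) {
    mkdir($backupRoot, 0755, true);
}

$log = function (string $line) use ($logFile) {
    file_put_contents($logFile, date('c') . ' ' . $line . "\n", FILE_APPEND);
};

foreach ($directories as $directory) {
    if (!is_dir($directory)) {
        $log("Directory does not exist: $directory");
        continue;
    }

    foreach (glob($directory . '/*.jpg') as $filePath) {
        if (time() - filemtime($filePath) > $threshold) {
            // Note: files with the same name in different directories would
            // collide here; a real version would need to handle that.
            $target = $backupRoot . '/' . basename($filePath);
            $log(rename($filePath, $target) ? "Moved $filePath -> $target" : "Failed to move $filePath");
        }
    }
}

// A mail() call (or similar) here could send the success notification.
$log('Cleanup run finished.');
?>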
__________________
Cerulean Software Specializes in Website and App Development. Email me today!

Keep Your Business and Members Area Secure with LoginBlue Password and Content Protection
Old 01-14-2025, 06:47 PM   #15
Killswitch
👏 REVOLUTIONARY 👏
 
Killswitch's Avatar
 
Industry Role:
Join Date: Oct 2012
Posts: 2,304
Quote:
Originally Posted by cerulean View Post
If I were you, I would avoid doing this in PHP (and I'm primarily a PHP developer.) PHP is certainly capable of this, but loading a programming language runtime just to run existing system commands does complicate the process.

If you're on Linux, you can use the find command and your host should be well-versed in how to do this anyway:

https://stackoverflow.com/a/69374901

If you're on Windows, similar things exist:

https://stackoverflow.com/a/51069

Either way, you should make backups of these files before deleting them, capture the output of whatever you run, and send a notification on success. That way you know when it runs and whether it succeeded.
For the record, while the overhead of executing PHP to run file commands like unlink is overhead, it's negligible, and you shouldn't complicate things for yourself because of someone else's preferences.
Old 01-14-2025, 07:05 PM   #16
Publisher Bucks
Confirmed User
 
Industry Role:
Join Date: Oct 2018
Location: New Orleans, Louisiana. / Newcastle, England.
Posts: 1,123
Quote:
Originally Posted by WiredGuy View Post
Run it in smaller batches. Keep track of a counter; after processing X images, halt the loop, and then run it again every hour instead of daily.
WG
Thanks Charles, LTNS, I think we last chatted in Tempe at the Phoenix Forum with Aly Drummond!
__________________
SOMETHING EXTREME IS COMING SOON!
Old 01-14-2025, 07:07 PM   #17
Publisher Bucks
Confirmed User
 
Industry Role:
Join Date: Oct 2018
Location: New Orleans, Louisiana. / Newcastle, England.
Posts: 1,123
Wow, I’m so glad that people are getting offended on my behalf about a host shutting down a cron job on a $2 a month shared hosting plan that is only used for image generation… Classic GFY (and a good example of why society as a whole is failing).
__________________
SOMETHING EXTREME IS COMING SOON!
Old 01-14-2025, 07:54 PM   #18
cerulean
Web & App Development
 
cerulean's Avatar
 
Industry Role:
Join Date: Oct 2023
Location: United States
Posts: 120
Quote:
Originally Posted by Killswitch View Post
For the record, while the overhead of executing PHP to run file commands like unlink is overhead, it's negligible, and you shouldn't complicate things for yourself because of someone else's preferences.
This isn't the first time this has come up for me personally, and I stand by what I've said. The OP is using readdir, unlink, and pathinfo, and while you are focusing specifically on unlink, all three in tandem across multiple files can compound memory and resource issues. Without benchmarking it, I couldn't definitively say so, but my experience tells me that using find would be a lot more performant and would answer OP's request.

I don't know if the host is being honest, and I am not sure of the underlying system allocation, but there is truth to what I'm saying, and it's not necessarily negligible. PHP handles its own memory optimization and garbage collection, and there's not much you can do to make these more performant. It's already a very performant language. That doesn't mean it can handle large file operations with ease. It's not the right tool for the job, in this case, in my opinion.

I was looking for some of the old threads and expert opinions I researched when I was attempting to optimize a phar executable for code obfuscation purposes, but I think most of those old forums are now defunct. My bookmarks were dead, but these SO posts shed some light on how file operations can be resource hogs in PHP, if you're interested in reading them. The bit on unlink not being asynchronous is very interesting.

https://stackoverflow.com/q/6627952

https://stackoverflow.com/q/37633680

https://stackoverflow.com/questions/...nous-functions

We can differ on this and have our own opinions, but I do stand by mine.

Quote:
Originally Posted by Publisher Bucks View Post
Wow, I’m so glad that people are getting offended on my behalf about a host shutting down a cron job on a $2 a month shared hosting plan that is only used for image generation… Classic GFY (and a good example of why society as a whole is failing).
I would not mind knowing the name of a managed shared hosting platform that offers $2/mo plans with cronjob support. That sounds like a diamond in the rough.
__________________
Cerulean Software Specializes in Website and App Development. Email me today!

Keep Your Business and Members Area Secure with LoginBlue Password and Content Protection
Old 01-18-2025, 06:33 PM   #19
natkejs
Confirmed User
 
Industry Role:
Join Date: Jan 2003
Location: Nomad Land
Posts: 1,603
Quote:
Originally Posted by Killswitch View Post
For the record, while the overhead of executing PHP to run file commands like unlink is overhead, it's negligible, and you shouldn't complicate things for yourself because of someone else's preferences.
It's not about the overhead of running unlink; it's about the directory listing, which is far faster with find.

Since the PHP script does nothing that a simple find command can't do, the cron should simply run the find command, not a PHP script.

Something like:

find /path/1/ /path/2/ /path/3 -type f -name '*.jpg' -mtime +30 -exec rm -f {} \;
Old 01-18-2025, 09:56 PM   #20
Killswitch
👏 REVOLUTIONARY 👏
 
Killswitch's Avatar
 
Industry Role:
Join Date: Oct 2012
Posts: 2,304
Quote:
Originally Posted by natkejs View Post
It's not about the overhead of running unlink; it's about the directory listing, which is far faster with find.

Since the PHP script does nothing that a simple find command can't do, the cron should simply run the find command, not a PHP script.

Something like:

find /path/1/ /path/2/ /path/3 -type f -name '*.jpg' -mtime +30 -exec rm -f {} \;
You're assuming this person is comfortable and familiar with the command line and the find command.

My assumption is they know how to write PHP, but that's about it.

And if you're optimizing your cron jobs, there's a root issue not being solved.
Old 01-18-2025, 10:16 PM   #21
natkejs
Confirmed User
 
Industry Role:
Join Date: Jan 2003
Location: Nomad Land
Posts: 1,603
Quote:
Originally Posted by Killswitch View Post
You're assuming this person is comfortable and familiar with the command line and the find command.

My assumption is they know how to write PHP, but that's about it.

And if you're optimizing your cron jobs, there's a root issue not being solved.
Well yes, since OP is running on $2 shared hosting I'm assuming it's not managed, and if you can set up a cron for a PHP script you can also set up a cron running a find command.

If your task is to remove images older than 30 days, I don't see too many issues doing it this way.

It would be faster to have the images and dates indexed in a RAM db, but on $2 shared hosting I imagine your options are limited. Depending on the sizes of the directories, using find like this should be just fine, and much, much faster than relying on PHP.
Old 01-18-2025, 11:05 PM   #22
natkejs
Confirmed User
 
Industry Role:
Join Date: Jan 2003
Location: Nomad Land
Posts: 1,603
Ok I get it, he should pay for more resources and not worry about optimization of a cron job. But optimizing is so much fun, and cron jobs can require a gazillion % more CPU than necessary when not done right.

I know from experience having run more than a few WP installs on the same server. Shit ain't fun when wp-cron pops off on 1k domains at once.