
GoFuckYourself.com - Adult Webmaster Forum (https://gfy.com/index.php)
-   Fucking Around & Business Discussion (https://gfy.com/forumdisplay.php?f=26)
-   -   How to manage a vps (https://gfy.com/showthread.php?t=1270315)

Barry-xlovecam 09-20-2017 05:03 PM

deny from 180.76.15.0/24

but I think that iptables or ufw firewalls are a better way to go than .htaccess
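As a minimal sketch of the same block done at the firewall instead of .htaccess (pick one tool; these are illustrative commands, not rules copied from this server):

Code:

$ sudo ufw deny from 180.76.15.0/24
# or, with iptables directly:
$ sudo iptables -A INPUT -s 180.76.15.0/24 -j DROP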

Baidu is velly sneeky ...


Code:

root@(none):~# dig ANY baidu.com

; <<>> DiG 9.7.3 <<>> ANY baidu.com
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 9342
;; flags: qr rd ra; QUERY: 1, ANSWER: 15, AUTHORITY: 0, ADDITIONAL: 2

;; QUESTION SECTION:
;baidu.com.                        IN        ANY

;; ANSWER SECTION:
baidu.com.                7200        IN        SOA        dns.baidu.com. sa.baidu.com. 2012136870 300 300 2592000 7200
baidu.com.                7200        IN        TXT        "v=spf1 include:spf1.baidu.com include:spf2.baidu.com include:spf3.baidu.com a mx ptr -all"
baidu.com.                7200        IN        TXT        "google-site-verification=GHb98-6msqyx_qqjGl5eRatD3QTHyVB6-xQ3gJB5UwM"
baidu.com.                7200        IN        MX        20 mx50.baidu.com.
baidu.com.                7200        IN        MX        10 mx.n.shifen.com.
baidu.com.                7200        IN        MX        20 mx1.baidu.com.
baidu.com.                7200        IN        MX        20 jpmx.baidu.com.
baidu.com.                539        IN        A        123.125.114.144
baidu.com.                539        IN        A        220.181.57.217
baidu.com.                539        IN        A        111.13.101.208
baidu.com.                86400        IN        NS        ns4.baidu.com.
baidu.com.                86400        IN        NS        ns7.baidu.com.
baidu.com.                86400        IN        NS        dns.baidu.com.
baidu.com.                86400        IN        NS        ns3.baidu.com.
baidu.com.                86400        IN        NS        ns2.baidu.com.

;; ADDITIONAL SECTION:
mx1.baidu.com.                300        IN        A        61.135.163.61
jpmx.baidu.com.                7200        IN        A        61.208.132.13

;; Query time: 252 msec
;; SERVER: 75.127.97.7#53(75.127.97.7)
;; WHEN: Wed Sep 20 19:56:06 2017
;; MSG SIZE  rcvd: 509


Barry-xlovecam 09-20-2017 06:35 PM

Quote:

Originally Posted by porn-update (Post 22007745)
I was thinking of installing Fail2ban, but I saw that it reads Apache errors.

But I think I didn't understand something because:
Code:

root@ubuntu-2gb-blr1-14-04-3:~# tail -f error_log|fgrep '[rewrite:'
tail: cannot open ‘error_log’ for reading: No such file or directory
root@ubuntu-2gb-blr1-14-04-3:~# LogLevel alert rewrite:trace3
LogLevel: command not found


What am I missing? How should I use this?

Sorry, long day -- that log's location is /path/to/THAT/error_log
use the full path to THAT log, not the domain config log.

Also pipe it into grep:
$ tac <path/to/file/error.log> | egrep -i 'this|or|that' | less
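As a sketch, assuming a stock Ubuntu Apache layout (adjust the paths to your setup): the LogLevel line is an Apache 2.4 directive, so it belongs in the vhost or apache2.conf, not on the shell, and the error log usually lives under /var/log/apache2/.

Code:

# in the vhost / apache2.conf, then reload Apache:
LogLevel alert rewrite:trace3

$ sudo service apache2 reload
$ tail -f /var/log/apache2/error.log | fgrep '[rewrite:'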

porn-update 09-21-2017 08:50 AM

Thanks a lot for the answers

Unfortunately I am forced to postpone this because of 247host, which has just suspended the account on my second server.

The one where I only keep test sites, or sites that get maybe 10-200 visits -- they suspended it for excessive resource consumption... :disgust
There are about 40 sites, but they do practically nothing...

It is the second time in a week they have pushed me to buy a more expensive service.
They do not warn you, do not give any notice, do not say anything; they immediately suspend the account and send you an email with a link to buy a more expensive plan...
Along with the account, all the mail dies too... including the work addresses... :mad:

I would have switched providers soon anyway, but I was hoping to do everything calmly, and instead no... :mad:

They're really shit.




Anyway, leaving aside the anger...

I thought I would try Linode, so as not to have everything on DigitalOcean: a small server but fully up to date, with 16.04 or 17.04, PHP 7, etc. etc. etc...

Any particular tips, or differences between Linode and DigitalOcean?




P.S. Lucky me -- I will spend the next week moving sites again... always because of these crappy services... :(

Barry-xlovecam 09-21-2017 09:55 AM

I use both.
I have used Linode for 9 or 10 years.

I think they are comparable.
I am going to try Leaseweb when I have time -- their pricing for small VPS is very good -- have to see if there are any issues ...

I think you have PHP code issues if you are exceeding your resources. Is error_log the PHP error log name?

you might get some of the script errors if you use the PHP CLI output at the terminal

Code:

cd path/
php <script name>.php

you may need to install php cli
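On Ubuntu that is roughly the following (package name assumed for 16.04's stock repos; it differs on other releases):

Code:

$ sudo apt-get install php7.0-cli
$ cd /path/to/script/
$ php scriptname.php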

porn-update 09-21-2017 10:09 AM

Linode does not accept my credit card... :(

So maybe DigitalOcean for the moment, if I can get some mail working, since my mailboxes were in the closed account... :angrysoap

Today is really a shitty day :(

porn-update 09-21-2017 12:24 PM

So, I moved the sites that have mail over to Freehostia and set up the mailboxes.

Now I receive all the mail in the world, except the one with the password for the new DigitalOcean droplet :angrysoap

What do you think about Vultr?

porn-update 09-21-2017 03:46 PM

In the end I managed to set up a new LAMP 16.04 droplet on DigitalOcean.

The mail does not work yet, but I managed to access the new droplet via SSH key authentication and changed the password
(with some difficulty, because as usual the DigitalOcean guides are always missing a piece).

I installed pretty much the whole LAMP stack quickly enough (after all the times I threw everything away and restarted on 14.04, I have learned by now).

I'm starting to upload my sites in a bit.

This time, however, I really need the mail: two addresses on two different domains.

How do I proceed? Install a whole mail server on my droplet?
Postfix? Dovecot? Roundcube?

Are there any other alternatives? I just need to be able to configure them in my Thunderbird and send and receive mail.
And maybe have a place to see them online when I'm not at my home PC...
And maybe, if possible, recover the cPanel backup...

porn-update 09-22-2017 08:40 AM

Maybe Zoho?

Barry-xlovecam 09-22-2017 08:53 AM

You need access to your DNS records to change the MX entry.

You can also add an SPF TXT record, and point the mail (MX) at some other location.
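As a rough illustration only -- the hostnames below are made up, your mail provider supplies the real ones -- the DNS side usually ends up looking something like this:

Code:

; MX sends mail for the domain to the external provider
example.com.   3600  IN  MX   10 mx1.mailprovider.example.
example.com.   3600  IN  MX   20 mx2.mailprovider.example.

; SPF TXT record authorizing that provider to send on your behalf
example.com.   3600  IN  TXT  "v=spf1 include:spf.mailprovider.example ~all"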

porn-update 09-22-2017 12:09 PM

So, I configured my mail with Zoho -- DNS, SPF, etc. For the moment I am not very relaxed, because I noticed some strange behavior during the configuration through DigitalOcean and received some error messages, such as "non-existent domain" or "relay disabled". I tried sending mail to myself from the same address, and to other addresses, and not all the messages arrived...

I hope it is just DNS propagation; we will wait a couple of days and see if everything starts to work...

Just in case, are there any other similar free services?

In the next few days I will restore my sites from 247host: after 2 days they gave me a backup that weighs 4.7 GB but is missing about 40 sites... they are really crap...

porn-update 09-22-2017 01:11 PM

I have a problem: the MySQL server goes away...

Warning: PDOStatement::execute(): MySQL server has gone away in /var/www/html/xxxhashtag.com/sitemap_generator.php on line 88

I am reloading the sites from old backups, but I need to sync the sitemaps.

I have a script which reads the new data in the database, adds links to the sitemap, and updates the database setting "insitemap = 1"; on the next run it reads only the rows with "insitemap = 0".

Normally everything works, but now I have to sync everything into the sitemap, about 600,000 rows...

The script sits there for quite a while, then returns errors of this type.

I have increased the limits in mysql.cnf and php.ini, but it still does not work. What else can I do?
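For reference, the settings that usually matter for "MySQL server has gone away" on long jobs are the ones below (a sketch only; the values are examples, not recommendations). Splitting the 600,000 rows into smaller batches per run is often the more reliable fix anyway.

Code:

# my.cnf, [mysqld] section -- example values only
max_allowed_packet = 256M
wait_timeout       = 600
net_read_timeout   = 120

; php.ini -- example values only
memory_limit       = 512M
max_execution_time = 300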

porn-update 09-25-2017 01:38 PM

So,
for the email I solved it with Yandex, and it works... Zoho kept doing strange things...

For MySQL I tried different configurations, several restarts, etc., but without great results.

I doubled the server's resources and installed a local LAMP stack where I recreated the sitemaps, which I then uploaded to the server.
For the moment that is solved.

I still cannot run the cronjob files manually (the server just sits on a white page for hours). I created the cronjob with curl; let's see if it manages to finish processing the file, or at least report some error...

I have already found a problem with MySQL 5.7: it does not like my GROUP BY queries. For now I solved it by removing "only_full_group_by" from sql_mode; I have too many queries to modify to fix it properly now... I will update the sites in the future.
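A sketch of that workaround, in case it is useful (the REPLACE trick only affects the running server; putting sql_mode in my.cnf makes it survive a restart):

Code:

-- check the current setting
SELECT @@sql_mode;

-- remove ONLY_FULL_GROUP_BY for the running server (lost on restart)
SET GLOBAL sql_mode=(SELECT REPLACE(@@sql_mode,'ONLY_FULL_GROUP_BY',''));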

Barry-xlovecam 09-25-2017 07:29 PM

my best thought is to run that script you are having all those issues with in the PHP CLI

Code:

$ cd /path/to/script/
PHP Code:

<?php
/* comment these two lines back out for production */
error_reporting(E_ALL);
ini_set('display_errors', true);
/**********************************/

put full error reporting on
Code:

$ php scriptname.php
maybe you will get some troubleshooting clues ...

porn-update 09-26-2017 11:42 AM

It shows me the PHP source of the file... just like the cronjob did on 14.04...

Weird, weird, weird...

Looking online, the first (and perhaps only) solution you find is: you forgot to enable "short_open_tag" in php.ini. But that is the first thing I activated...

porn-update 09-26-2017 01:22 PM

So,
To begin with I found this
Code:

2017-09-26T19:51:42.376416Z 678542 [ERROR] /usr/sbin/mysqld: Table './bigbreasthardpics/tgp_search' is marked as crashed and last (automatic?) repair failed
2017-09-26T19:51:43.239533Z 678552 [ERROR] /usr/sbin/mysqld: Table './bigbreasthardpics/tgp_search' is marked as crashed and last (automatic?) repair failed
2017-09-26T19:51:43.462009Z 678562 [ERROR] /usr/sbin/mysqld: Table './bigbreasthardpics/tgp_search' is marked as crashed and last (automatic?) repair failed
2017-09-26T19:51:43.595315Z 678564 [ERROR] /usr/sbin/mysqld: Table './bigbreasthardpics/tgp_search' is marked as crashed and last (automatic?) repair failed
2017-09-26T19:51:44.424527Z 678574 [ERROR] /usr/sbin/mysqld: Table './bigbreasthardpics/tgp_search' is marked as crashed and last (automatic?) repair failed

It probably messed things up quite a lot... I reloaded the table and now it seems to work.

Then I split my cronjob up; more or less all the files seem to work, albeit slowly.

But one particular file stays stuck for hours. It is the file that synchronizes data between these sites: adulthashtag.com, xxxhashtag.com, porn-tags.com. Each day they exchange the new rows added to the database, and each one adds whatever it is missing.

The exchange file is a .txt, weighs about 50 KB and usually contains 1,500-2,000 new rows.

Each file lives on its own site and is loaded by the others with a function like this:
Code:

$file = "Http://www.adulthashtag.com/new_query_search.txt";
foreach (File ($file) as $search _ Query_row) {

Could it be that some PHP restriction or firewall rule prevents reading files from outside the site and blocks everything for hours?
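Two things worth checking here, as a sketch (the URL is the one from the snippet above; the cURL variant is only an illustration, not what the sites currently run): allow_url_fopen must be On for file() to read a remote URL at all, and a cURL fetch with a hard timeout keeps the script from hanging for hours when the other site does not answer.

PHP Code:

<?php
// file() over HTTP only works if php.ini has: allow_url_fopen = On

// alternative: fetch with cURL and a timeout, then split into rows
$ch = curl_init('http://www.adulthashtag.com/new_query_search.txt');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);  // give up connecting after 10s
curl_setopt($ch, CURLOPT_TIMEOUT, 30);         // give up entirely after 30s
$data = curl_exec($ch);
curl_close($ch);

if ($data !== false) {
    foreach (explode("\n", $data) as $search_query_row) {
        // ... same processing as before ...
    }
}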

I have these in my syslog. UFW should be the firewall, but I could not work out
whether it is just doing its job or whether it is blocking me???
Code:

Sep 26 06:31:34 ubuntu-1gb-nyc3-01 kernel: [36766.523428] [UFW BLOCK] IN=eth0 OUT= MAC=36:1a:36:97:ff:ba:84:b5:9c:f9:08:30:08:00 SRC=210.146.241.198 DST=104.236.230.48 LEN=52 TOS=0x00 PREC=0x00 TTL=115 ID=17950 DF PROTO=TCP SPT=58001 DPT=8118 WINDOW=8192 RES=0x00 SYN URGP=0
Sep 26 06:31:55 ubuntu-1gb-nyc3-01 kernel: [36786.721018] [UFW BLOCK] IN=eth0 OUT= MAC=36:1a:36:97:ff:ba:84:b5:9c:f9:18:30:08:00 SRC=138.201.19.161 DST=104.236.230.48 LEN=56 TOS=0x02 PREC=0x00 TTL=119 ID=14604 DF PROTO=TCP SPT=63002 DPT=8118 WINDOW=8192 RES=0x00 CWR ECE SYN URGP=0
Sep 26 06:32:15 ubuntu-1gb-nyc3-01 kernel: [36807.234634] [UFW BLOCK] IN=eth0 OUT= MAC=36:1a:36:97:ff:ba:84:b5:9c:f9:18:30:08:00 SRC=46.161.9.49 DST=104.236.230.48 LEN=60 TOS=0x00 PREC=0x00 TTL=57 ID=13838 DF PROTO=TCP SPT=50016 DPT=8118 WINDOW=14600 RES=0x00 SYN URGP=0
Sep 26 06:32:34 ubuntu-1gb-nyc3-01 kernel: [36826.106792] [UFW BLOCK] IN=eth0 OUT= MAC=36:1a:36:97:ff:ba:84:b5:9c:f9:18:30:08:00 SRC=213.136.75.227 DST=104.236.230.48 LEN=60 TOS=0x00 PREC=0x00 TTL=53 ID=12944 DF PROTO=TCP SPT=47055 DPT=8118 WINDOW=29200 RES=0x00 SYN URGP=0

The cronjobs should have run by now, but they have shown no signs of life... I will try installing Postfix like last time on 14.04... although this time I see no references to postfix in the logs; hopefully it helps.

Barry-xlovecam 09-27-2017 10:19 AM

check those IPs:
$ whois <ip>

the 104.xxx is your server?

https://lists.torproject.org/piperma...ch/004159.html

https://lists.torproject.org/piperma...ch/004160.html

possibly ... the former IP user?

porn-update 09-27-2017 12:03 PM

Yes, and I do use Tor, but not often enough to fill the logs.

I was thinking... could it be Yandex with the emails?

porn-update 09-27-2017 12:19 PM

:angrysoap After the problem with the bigbreasthardpics table I took a look at all the databases.

I am finding a lot of MySQL import errors from the 247host backup.
Missing auto-increments, primary keys, default values, null values, etc. etc. etc.

Practically all the imported tables have problems...

The 247host backup is really crap...

Could the MySQL problems be why I am finding errors in the firewall log, hearing nothing from the cronjobs, unable to run files, etc. etc. etc.???

porn-update 09-28-2017 01:28 PM

So, after correcting an infinity of MySQL problems caused by the import, something seems to be starting to work.

The cronjobs too; the cross-site synchronization problem also turned out to be due to a missing auto-increment field.

Now the idea is to leave the server alone for a few days and see what happens... especially whether this chart normalizes.

http://porn-update.com/temp/Schermat...2022-26-54.png

Right now it scares me...
but the other server also consumed many more resources in its first days than in normal use, and it did not have all these problems with its MySQL tables.

Barry-xlovecam 09-28-2017 03:22 PM

These are not server problems, they are software problems.
  1. I would back up the tables you have now
  2. then truncate the data in the tables
  3. correct any column type errors
  4. then repopulate the tables with new data using your PHP script
  5. if the CPU use is too high then -- you have some errors or a memory leak in your PHP script
(a rough sketch of steps 1-3 follows)
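A rough sketch of steps 1-3, with made-up database/table/column names:

Code:

$ mysqldump --add-drop-table -u root -p yourdb yourtable > yourtable.backup.$(date +%F).sql

-- then, in the mysql client:
TRUNCATE TABLE yourtable;
ALTER TABLE yourtable
  MODIFY id INT(11) NOT NULL AUTO_INCREMENT,
  ADD PRIMARY KEY (id);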

porn-update 09-29-2017 11:55 AM

In the end I did just that: I installed a local LAMP stack and recreated all the tables I could via script.

The imported ones were not recoverable: many tables were missing the auto-increment and primary key fields, and the auto-increment columns were full of empty values, missing numbers and zeros, etc. etc. etc. -- really a mess...

Recreating the tables locally and importing my data, everything looks much more correct.

Today the first mail report from the xxxhashtag.com cronjob also arrived.
But maybe something still does not work... the cronjob took 8800 seconds... :eek7 usually this cronjob takes 2...

Now I have to figure out whether something is still wrong or whether it just had to deal with a heavy sync with the other sites.
Unfortunately the backups I had were not up to date, and xxxhashtag works with many sites and many databases.
Usually it checks the data from the last 24 hours, but having just restored the databases, a lot of data has probably been added in the last 24 hours...

We will see in the next few days...

porn-update 09-29-2017 02:40 PM

Looking through the tables I found a lot of duplicate indexes that I did not create...

Code:

CREATE TABLE IF NOT EXISTS `xxxhashtag_search` (
  `id_search` int(11) NOT NULL AUTO_INCREMENT,
  `query` varchar(255) NOT NULL,
  `views` int(11) NOT NULL DEFAULT '1',
  `insitemap` int(1) NOT NULL DEFAULT '0',
  `insitemap_link` int(1) NOT NULL DEFAULT '0',
  `insitemap_link2` int(1) NOT NULL DEFAULT '0',
  `data_ins` varchar(255) NOT NULL DEFAULT '1388796621',
  `last_mod` varchar(255) DEFAULT '1415144202',
  `engine` varchar(255) NOT NULL,
  PRIMARY KEY (`id_search`),
  KEY `query` (`query`),
  KEY `query_2` (`query`),
  KEY `query_3` (`query`),
  KEY `query_4` (`query`),
  FULLTEXT KEY `query_5` (`query`),
  FULLTEXT KEY `query_6` (`query`)
) ENGINE=MyISAM  DEFAULT CHARSET=latin1 AUTO_INCREMENT=925373 ;

After removing the extra indexes, the script went from 8000 seconds to 0.2 seconds...

Now I am checking all the other tables for duplicate indexes...
(It will take me about 3 days, lucky me... :Oh crap)
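For the record, the clean-up is roughly this (the exact DROP INDEX list is a reconstruction from the dump above, keeping one plain KEY and one FULLTEXT on `query`):

Code:

-- see what is defined
SHOW INDEX FROM xxxhashtag_search;

-- drop the duplicates
ALTER TABLE xxxhashtag_search
  DROP INDEX query_2,
  DROP INDEX query_3,
  DROP INDEX query_4,
  DROP INDEX query_6;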

porn-update 10-02-2017 02:00 PM

And now...

Everything seems to be starting to work... and questions start coming to mind, like:

How can I make some backups?
Only the databases (I can regenerate all the files from the databases), ideally weekly, but the procedure should not weigh too much on the server (some MySQL databases are now 60-70 MB of data and the server is small).

Maybe it would also be nice if the databases were sent somewhere, like my PC, or Yandex Disk, or a big mailbox, or something like that... just to be sure that if the server catches fire they are somewhere else... :upsidedow but again, all this should not abuse server resources too much.


Then, what would be the fastest way to move a site to another server? (Still unmanaged.)

I still have a bit of a desire to try Linode, or maybe Vultr (I asked, but I did not understand whether Vultr accepts adult).

For quite a while my sites will stay here, after all the effort I made. Still, I wonder: if I wanted to move a site, would I have to reload all the databases one by one via phpMyAdmin and all the files via FTP?
Is there a faster, more practical and more secure way?

Barry-xlovecam 10-03-2017 07:19 AM

SSH

Warning Will Robinson: Use mysql as root user so you can lock the tables

make a directory for your mysql backups
cd to that directory you make

Code:

# use root to lock the tables
$ mysqldump --add-drop-table  -u root  -p [DATABASE NAME] >[DATABASE NAME].backup.$(date +%F).sql
Enter password:

/home/user/****.com/****/[DATABASE NAME].backup.2017-09-24.sql
is made^^
use scp or rsync to move the backup to other locations

You will have to

create the database and a database user, and grant permissions as needed

CREATE DATABASE [DATABASE NAME];

then read this
https://stackoverflow.com/questions/...-line-in-mysql
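For the restore side on the new server, a minimal sketch (paths and names are placeholders):

Code:

# copy the dump to the new machine
$ scp /home/user/backups/[DATABASE NAME].backup.2017-09-24.sql user@newserver:/home/user/

# on the new server: create the database, then import the dump
$ mysql -u root -p -e "CREATE DATABASE [DATABASE NAME];"
$ mysql -u root -p [DATABASE NAME] < [DATABASE NAME].backup.2017-09-24.sql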

porn-update 10-03-2017 02:57 PM

So...
I have taken a look and tried to understand everything.

Restoring I understand, because I had to use the command line to load those damn corrupted databases; via phpMyAdmin I could not import them.

For backup I have a few questions:
Can I create a single file to use in crontab?
E.g. an executable .sh... (I do not really know what they are called or what they are, but I have happened to make a few on my PC...) -- a single file with all the commands needed to back up all the databases, to use in a cronjob?

And launch this file once a week or once a month via crontab?
It depends on how many resources it consumes when launching all the backups together.

I did not know about SCP, but I like it... so much so that I have a little Raspberry Pi attached to my router that spends the day backing up my data between my PCs, my phone, my tablet and some cloud services.

So if I can pull SCP backups of the databases onto my Raspberry Pi, it can save them anywhere...

Do the backup files have to be in /home/user/ in order to be "taken" with SCP?
Even if SSH logs in as root?






------------------------------------------------------------------
Another little thing, off topic, but it came up a minute ago...

Doing
Code:

sudo apt-get update
sudo apt-get upgrade

In 14.04 the server says this:
Code:

Processing triggers for libapache2-mod-php5.6 (5.6.31-6+ubuntu14.04.1+deb.sury.org+1) ...
Processing triggers for php5.6-fpm (5.6.31-6+ubuntu14.04.1+deb.sury.org+1) ...
php5.6-fpm stop/waiting
php5.6-fpm start/running, process 26417
NOTICE: Not enabling PHP 5.6 FPM by default.
NOTICE: To enable PHP 5.6 FPM in Apache2 do:
NOTICE: a2enmod proxy_fcgi setenvif
NOTICE: a2enconf php5.6-fpm
NOTICE: You are seeing this message because you have apache2 package installed.
php5.6-fpm stop/waiting
php5.6-fpm start/running, process 26465

Should I do that?
I installed PHP 5.5 along with the LAMP stack, then PHP 5.6 later, but I don't remember ever asking for FPM...
Is it better to use FPM on 14.04? Or is FPM obsolete now? If I enable it, do I have to redo the php.ini configuration, re-enable opcache, etc. etc. etc.?

porn-update 10-04-2017 03:51 PM

I found this script, almost perfect for what I want to do

Code:

#!/bin/bash
# Shell script to backup MySql database
# To backup MySQL database files to /backup dir and later pick up by your
# script. You can skip few databases from backup too.
# For more info please see (Installation info):
# http://www.cyberciti.biz/nixcraft/vivek/blogger/2005/01/mysql-backup-script.html
# Last updated: Aug - 2005
# --------------------------------------------------------------------
# This is a free shell script under GNU GPL version 2.0 or above
# Copyright (C) 2004, 2005 nixCraft project
# -------------------------------------------------------------------------
# This script is part of nixCraft shell script collection (NSSC)
# Visit http://bash.cyberciti.biz/ for more information.
# -------------------------------------------------------------------------
 
MyUSER=username    # USERNAME
MyPASS=password  # PASSWORD
MyHOST=hostname        # Hostname
 
# Linux bin paths, change this if it can't be autodetected via which command
MYSQL="$(which mysql)"
MYSQLDUMP="$(which mysqldump)"
CHOWN="$(which chown)"
CHMOD="$(which chmod)"
GZIP="$(which gzip)"
 
# Backup Dest directory, change this if you have someother location
DEST="/var/backup"
 
# Main directory where backup will be stored
MBD="$DEST/mysql"

# delete old backups
rm -f $MBD/*
 
# Get hostname
HOST="$(hostname)"
 
# Get data in dd-mm-yyyy format
NOW="$(date +"%d-%m-%Y")"
 
# File to store current backup file
FILE=""
# Store list of databases
DBS=""
 
# DO NOT BACKUP these databases
IGGY="information_schema cond_instances mysql performance_schema phpmyadmin"
 
[ ! -d $MBD ] && mkdir -p $MBD || :
 
# Only root can access it!
$CHOWN 0.0 -R $DEST
$CHMOD 0600 $DEST
 
# Get all database list first
DBS="$($MYSQL -u $MyUSER -h $MyHOST -p$MyPASS -Bse 'show databases')"
 
for db in $DBS
do
    skipdb=-1
    if [ "$IGGY" != "" ];
    then
        for i in $IGGY
        do
            [ "$db" == "$i" ] && skipdb=1 || :
        done
    fi
 
    if [ "$skipdb" == "-1" ] ; then
        FILE="$MBD/$db.$HOST.$NOW.gz"
        # no gzip, I compress the whole folder afterwards
        FILE="$MBD/$db.$HOST.$NOW.sql"

        # do all inone job in pipe,
        # connect to mysql using mysqldump for select mysql database
        # and pipe it out to gz file in backup dir :)
        #$MYSQLDUMP -u $MyUSER -h $MyHOST -p$MyPASS $db | $GZIP -9 > $FILE

        # no gzip, I compress the whole folder afterwards
        $MYSQLDUMP -u $MyUSER -h $MyHOST -p$MyPASS $db > $FILE
    fi
done

# compress everything
zip -r $DEST/mysql_backup.$HOST.zip $MBD/

#tar -zcvf $DEST/mysql_backup.$HOST.tar.gz $MBD

I added this to delete last week's backups
Code:

# delete old backups
rm -f $MBD/*

I added the system databases among those excluded from the process
Code:

# DO NOT BACKUP these databases
IGGY="information_schema cond_instances mysql performance_schema phpmyadmin"

Removed the compression on mysqldump
Code:

#FILE="$MBD/$db.$HOST.$NOW.gz"
        # no gzip, I compress the whole folder afterwards
        FILE="$MBD/$db.$HOST.$NOW.sql"

        # do all inone job in pipe,
        # connect to mysql using mysqldump for select mysql database
        # and pipe it out to gz file in backup dir :)
        #$MYSQLDUMP -u $MyUSER -h $MyHOST -p$MyPASS $db | $GZIP -9 > $FILE

        # no gzip, I compress the whole folder afterwards
        $MYSQLDUMP -u $MyUSER -h $MyHOST -p$MyPASS $db > $FILE

At the end I added compression of the entire /mysql folder
Code:

# compress everything
zip -r $DEST/mysql_backup.$HOST.zip $MBD/

#tar -zcvf $DEST/mysql_backup.$HOST.tar.gz $MBD

From all this comes a single zipped file which, in theory, I should be able to download onto my Raspberry Pi with SCP.

Crontab? Just this line?
Code:

0 6 * * 4 /var/backup/mysql_backup
Will I receive some output via mail? Maybe the time the script took to run?

I hope I didn't make any big mistakes... I added, edited and deleted lines, but I have practically no idea what language the file is written in... it looks like PHP, and it seems to work... that is the best I can say... it would be nice if you could warn me if I made some horrendous error :helpme

One strange thing I noticed: in the compressed file I find the whole folder structure /var/backup/mysql, while I was expecting only /mysql. Not a big problem, but strange...

Now I will try to bring everything onto my Raspberry Pi via SCP; hopefully everything works.

The next and last step will be figuring out when to run it all so as not to cause problems for the server.

Barry-xlovecam 10-04-2017 06:37 PM

rm -f is a bad idea if you do not need it

plain rm is fine
Make a copy manually, like I told you,
then test your script manually -- have a plan B

try adding at the bottom of your script
Code:

$ echo "`date` backup done"
which will print out something like:
Wed Oct 4 21:28:16 EDT 2017 backup done

add this line to your cron and check what happened in the morning :)
Code:

0 6 * * 4 (cd /var/backup/mysql_backup/; ./backup_script_name.sh) | mail -s "subject backup done" [email protected]

porn-update 10-05-2017 01:15 PM

Thank you
I tried everything, and it works!!! :thumbsup

Even the mail arrives... without my telling it anything...
(probably because of the cron/Postfix link -- now anything that produces output gets mailed)

The output that arrives is the result of zip, but that is OK; at least it tells me whether it did something and whether it worked.

The only strange thing is that 16.04 complains a bit about this:
Code:

mysqldump: [Warning] Using a password on the command line interface can be insecure.
But I think I don't have many other alternatives
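For what it's worth, one alternative is to put the credentials in a client option file that only root can read and drop the -u/-p flags from the script -- a sketch, using the default location mysqldump reads when run as root:

Code:

# /root/.my.cnf  -- chmod 600
[client]
user     = root
password = yourpasswordhere

# then, in the script, no password on the command line:
$MYSQLDUMP $db > $FILE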

Rather than the date, can I get the time the script took?
Just to understand how long it kept the server busy.
In PHP I usually put a time() at the beginning of the script and one at the end and calculate the difference, but here I do not know how to do it... :(

Barry-xlovecam 10-05-2017 02:20 PM

Code:

barry@paragon-DS-7:~$ echo `date +%s`;

sleep 3; #script code here

echo `date +%s`;

outputs;

barry@paragon-DS-7:~$ echo `date +%s`; sleep 3; echo `date +%s`;
1507240274
1507240277

in seconds since epoch (just subtract the first value from the second)
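One common way to get the difference directly inside the script is bash arithmetic expansion -- a sketch:

Code:

START=$(date +%s)
# ... backup commands here ...
END=$(date +%s)
echo "Elapsed_time: $(( END - START )) seconds"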


If you don't have a MAILTO=
at the top of your crontab, you have to state it (with the right email address) in the cron line itself.

The password warning is just a security notice. This is not going over the internet, and it is a root cron, right? Well, if you can't trust root locally on your own server -- reformat fast!

Quote:

Data can be encrypted in the command channel, the data channel, or ideally, both. SCP: Secure Copy, or SCP, does not use FTP or SSL to transfer files, rather Secure Copy handles the file transfer and relies on the SSH protocol to provide authentication and security for both credentials and data.
Unless you are sending financial data or state secrets, I really would not worry about sending a password over SCP.

porn-update 10-06-2017 12:55 PM

OK, almost...

Excuse my stupidity, but this programming language seems strange to me.

So, is this right?

Code:

#At the beginning of the script, this:
STARTTIME = date +%s

#My script

#At the last line, this:
ENDTIME = date +%s
echo $ENDTIME - $STARTTIME;

Am I missing some ";", some "$", something else?

Sorry if it seems trivial and stupid, but from the script alone I can hardly work out how to do even the dumbest things.
Compared to my PHP world I miss the "$" and ";", and I do not understand why the variables are all written in uppercase or how lines end...
I think I understand that everything still runs in sequence, and that without $ I define a variable while with $ I read and use it, but about everything else I'm not sure...

Just to understand, what programming language is this?



Quote:

Unless, you are sending financial data or state secrets, I really would not worry sending a password SCP.
Here the only ones to have a secret are the transsexuals:winkwink:

Barry-xlovecam 10-06-2017 03:25 PM

no, in a bash .sh script the ; at the end of a statement is not needed
var=something (declares a variable)
like JavaScript
$var afterwards (reads a declared variable)
like
echo $var

the caps are just what I did, they could be lowercase too -- but bash .sh is case sensitive

in a terminal

$ dothis; dothat; isthis && dosomethinggood | (<pipe>) to the next statement

porn-update 10-09-2017 01:19 PM

So,
I did that...
Code:

#!/bin/bash
# Shell script to backup MySql database
# To backup MySQL database files to /backup dir and later pick up by your
# script. You can skip few databases from backup too.
# For more info please see (Installation info):
# http://www.cyberciti.biz/nixcraft/vivek/blogger/2005/01/mysql-backup-script.html
# Last updated: Aug - 2005
# --------------------------------------------------------------------
# This is a free shell script under GNU GPL version 2.0 or above
# Copyright (C) 2004, 2005 nixCraft project
# -------------------------------------------------------------------------
# This script is part of nixCraft shell script collection (NSSC)
# Visit http://bash.cyberciti.biz/ for more information.
# -------------------------------------------------------------------------
STARTTIME=date +%s

MyUSER=root    # USERNAME
MyPASS=Alfarenna79  # PASSWORD
MyHOST=localhost        # Hostname
 
# Linux bin paths, change this if it can't be autodetected via which command
MYSQL="$(which mysql)"
MYSQLDUMP="$(which mysqldump)"
CHOWN="$(which chown)"
CHMOD="$(which chmod)"
GZIP="$(which gzip)"
 
# Backup Dest directory, change this if you have someother location
DEST="/var/backup"
 
# Main directory where backup will be stored
MBD="$DEST/mysql"

# delete old backups
rm $MBD/*
 
# Get hostname
HOST="$(hostname)"
 
# Get data in dd-mm-yyyy format
NOW="$(date +"%d-%m-%Y")"
 
# File to store current backup file
FILE=""
# Store list of databases
DBS=""
 
# DO NOT BACKUP these databases
IGGY="information_schema cond_instances mysql performance_schema phpmyadmin"
 
[ ! -d $MBD ] && mkdir -p $MBD || :
 
# Only root can access it!
$CHOWN 0.0 -R $DEST
$CHMOD 0600 $DEST
 
# Get all database list first
DBS="$($MYSQL -u $MyUSER -h $MyHOST -p$MyPASS -Bse 'show databases')"
 
for db in $DBS
do
    skipdb=-1
    if [ "$IGGY" != "" ];
    then
        for i in $IGGY
        do
            [ "$db" == "$i" ] && skipdb=1 || :
        done
    fi
 
    if [ "$skipdb" == "-1" ] ; then
        #FILE="$MBD/$db.$HOST.$NOW.gz"
        # no gzip, I compress the whole folder afterwards
        FILE="$MBD/$db.$HOST.$NOW.sql"

        # do all inone job in pipe,
        # connect to mysql using mysqldump for select mysql database
        # and pipe it out to gz file in backup dir :)
        #$MYSQLDUMP -u $MyUSER -h $MyHOST -p$MyPASS $db | $GZIP -9 > $FILE

        # no gzip, I compress the whole folder afterwards
        $MYSQLDUMP -u $MyUSER -h $MyHOST -p$MyPASS $db > $FILE
    fi
done

# compress everything
zip -r $DEST/mysql-backup-$HOST.zip $MBD/

#tar -zcvf $DEST/mysql-backup-$HOST.tar.gz $MBD

ENDTIME=date +%s

TOTTIME=$ENDTIME-$STARTTIME

echo Elapsed_time: $TOTTIME

But it tells me
Code:

/var/backup/mysql_backup: line 15: +%s: command not found
It seems it does not like the +%s, but I really do not know how to solve it... :(

Barry-xlovecam 10-10-2017 05:03 AM

STARTTIME=(`date +%s`)

try like this and the time will be in epoch seconds

ENDTIME=(`date +%s`)

porn-update 10-10-2017 02:52 PM

It works, but the result is kinda odd...

Code:

Elapsed_time: 1507671766-1507671705
Practically it is telling me: do the math yourself :Oh crap
I tried putting quotes, parentheses, etc. etc., but it does not want to do it... can we manage this last little thing too?

It takes more time for this little thing than it took to configure the whole server... :)

Barry-xlovecam 10-10-2017 10:15 PM

Maybe in $()
TOTTIME=$($ENDTIME-$STARTTIME)

Do the math -- the sum is in seconds :)

barry@paragon-DS-7:~$ bc <<< 1507671766-1507671705
61
seconds

porn-update 10-11-2017 02:30 PM

Says:
Code:

/var/backup/mysql_backup: line 93: 1507756208-1507756139: command not found
But looking for "bc" (which I did not know), I found this:
Code:

TOTTIME=`expr $ENDTIME - $STARTTIME`
This seems to work :thumbsup

Now I'm worried about those odd quotes...
In PHP, when I find those quotes it usually means there was a copy-paste error from HTML and nothing works anymore.

So I have the habit of removing them as soon as I see them and replacing them with a normal apostrophe... in sh, instead, they seem to be fundamental... I have surely removed some thinking they were an error... :error

I shouldn't have done any damage, because everything seems to work, but maybe I'll go look at the original script and see if there were any... :upsidedow

P.S. It's strange how we can install an entire server, and then the simplest things drive us crazy... :)

Barry-xlovecam 10-11-2017 03:03 PM

these are called backticks

bc is a terminal calculator program

apt install bc

man bc

Quote:

The backtick (``) is actually called command substitution. The purpose of command substitution is to evaluate the command which is placed inside the backtick and provide its result as an argument to the actual command. The command substitution can be done in two ways one is using $() and the other is "``" .Dec 22, 2011
look above^^

Quote:

STARTTIME=(`date +%s`)

try like this and the time will be in epoch seconds

ENDTIME=(`date +%s`)
I just habitually close an expression in () for clarity in my code -- it is probably in most cases superfluous (but benign)
Like
our @array =(<FILENAME>);

porn-update 10-12-2017 02:34 PM

There's one last thing that scares me a lot

Load
http://porn-update.com/temp/Schermat...2023-13-39.png

Much of that red is due to the site-moving phase and all the import errors from those damned databases.

The other server also had a lot of red at the beginning, then it slowly normalized.

This one is taking a little longer...

But what seems strange to me is that when you look at the server details, you cannot see why there is all that red.

http://porn-update.com/temp/Schermat...2023-15-54.png

The CPU rarely reaches 90%, the memory is a bit chubby but it works, there is plenty of disk, and there are no errors or particular problems...
The sites are running well, fast, without interruptions or visible slowdowns...

The CPU sometimes shows "stolen" time even when it is working at maybe 70%, and that alone is odd.

But as usual it is the load that worries me most: sometimes 4-5, and I even saw 7 on cronjob days (they are still synchronizing a lot of data because of the missing cronjobs on the other servers).

What does load actually indicate?

And how much should I worry?

On a scale from "relax, everything is fine" to "shit, the server is going to explode, everyone run before it's too late, shit, we're all going to die :eyecrazy", where am I?

Barry-xlovecam 10-12-2017 06:58 PM

Your problem is your PHP script and the MySQL daemon (server) -- software for your application. Or:
if you look at the times of peak usage and grep those times in the server access logs, you may find that Bing is indexing too many pages too fast -- you can place a directive in robots.txt:

User-agent: bingbot
Crawl-delay: 5

(or 10 -- whatever value slows it down enough)

see
https://www.siteground.com/kb/how_to..._engine_bot/
https://www.bing.com/webmaster/help/...ntrol-55a30302

Slow Bing down -- don't Disallow Bing; they bring good converting traffic, and they sell their PSaaS / indexed database to Yahoo and other search engines.

You may find Baidu is indexing too many pages too fast -- block them at your firewall, I have had luck that way.
Porn is illegal in China and you won't sell to legit Chinese buyers either.
# Free IP2Location Firewall List by Search Engine
# Source: Whitelist Robots by Search Engine | IP2Location

Code:

whois -h v4.whois.cymru.com " -c -p 183.131.32.0/20"
AS      | IP              | BGP Prefix          | CC | AS Name
4134    | 183.131.32.0    | 183.128.0.0/11      | CN | CHINANET-BACKBONE No.31,Jin-rong Street, CN

whois -h v4.whois.cymru.com " -c -p 12.1.72.32/27"
7018    | 12.1.72.32      | 12.0.0.0/9          | US | ATT-INTERNET4 - AT&T Services, Inc., US

whois -h v4.whois.cymru.com " -c -p 104.193.88.0/22"
55967   | 104.193.88.0    | 104.193.88.0/24     | US | CNNIC-BAIDU-AP Beijing Baidu Netcom Science and Technology Co., Ltd., CN

or use Ruby gem nicinfo
https://github.com/arineng/nicinfo

that will give you full RDAP/whois information

Third way is just $ whois <ip address>

If you are generating many dynamic pages, search engines may be causing this problem.

Scrapers and *bad bots* may be the issue too.

This is what server logs are for: searching for problems and finding patterns.
A firewall is the way to go -- just do not answer -- drop the packet.

porn-update 10-16-2017 03:40 PM

But Holy cow :angrysoap

I was away 2 days and the server was invaded by bots, just like you said...

http://porn-update.com/temp/Schermat...2000-34-49.png

Code:

51.255.65.66 - - [16/Oct/2017:22:25:31 +0000] "GET /27 HTTP/1.1" 302 3634 "-" "Mozilla/5.0 (compatible; AhrefsBot/5.2; +http://ahrefs.com/robot/)"
157.55.39.234 - - [16/Oct/2017:22:25:09 +0000] "GET /cimla+sexy+photos.com/ HTTP/1.1" 200 32929 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
93.105.187.11 - - [16/Oct/2017:22:25:05 +0000] "GET /search.php?q=shemale+mia+isabella+teacher+her+student+a+lesson+free+porn&sort=date&page=5 HTTP/1.1" 200 10438 "http://www.bigbigbigboobs.com/search.php?q=shemale+mia+isabella+teacher+her+student+a+lesson+free+porn&sort=date" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0"
216.244.66.245 - - [16/Oct/2017:22:25:30 +0000] "GET /search-amy+anderssen+photos+pk/ HTTP/1.1" 200 79767 "-" "Mozilla/5.0 (compatible; DotBot/1.1; http://www.opensiteexplorer.org/dotbot, [email protected])"
207.46.13.86 - - [16/Oct/2017:22:25:24 +0000] "GET /search-bigboob+s+saree+woman+photo+pk/ HTTP/1.1" 200 24426 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
::1 - - [16/Oct/2017:22:25:31 +0000] "OPTIONS * HTTP/1.0" 200 126 "-" "Apache/2.4.18 (Ubuntu) OpenSSL/1.0.2g (internal dummy connection)"
66.249.64.3 - - [16/Oct/2017:22:25:16 +0000] "GET /love+sex+move/ HTTP/1.1" 200 34386 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
40.77.167.14 - - [16/Oct/2017:22:25:31 +0000] "GET /search-bbw+back+sid+girl+photos.com/ HTTP/1.1" 200 24797 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
66.249.70.19 - - [16/Oct/2017:22:25:32 +0000] "GET /74277 HTTP/1.1" 200 19774 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.70.30 - - [16/Oct/2017:22:25:32 +0000] "GET /savita+babhi/ HTTP/1.1" 200 20327 "-" "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
root@ubuntu-1gb-nyc3-01:~# tail /var/log/apache2/access.log
8.37.233.40 - - [16/Oct/2017:22:26:54 +0000] "GET /download+video+bokep+jepang+rina+araki/ HTTP/1.1" 200 29346 "https://www.google.co.id/search?client=ucweb-b-bookmark&q=video+ngentot+rina+araki&oq=video+ngentot+rina+araki&aqs=mobile-gws-lite.." "Mozilla/5.0 (Linux; U; Android 6.0.1; en-US; SM-G532G Build/MMB29T) AppleWebKit/534.30 (KHTML, like Gecko) Version/4.0 UCBrowser/11.3.5.972 U3/0.8.0 Mobile Safari/534.30"
49.34.127.70 - - [16/Oct/2017:22:26:56 +0000] "GET /xvillage+desi+8+saal+ki+bachi+ki+chudai+video/ HTTP/1.1" 200 31479 "android-app://com.google.android.googlequicksearchbox" "Mozilla/5.0 (Linux; Android 5.1.1; SM-J200G Build/LMY47X) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.91 Mobile Safari/537.36"
93.105.187.11 - - [16/Oct/2017:22:26:30 +0000] "GET /page-17/search-googleweblight.comlite_url+2+mom+big+naked+milky+boobs+images.com/date/ HTTP/1.1" 200 26231 "http://www.monsterboobshardpics.com/page-14/search-googleweblight.comlite_url+2+mom+big+naked+milky+boobs+images.com/date/" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0"
216.244.66.228 - - [16/Oct/2017:22:27:08 +0000] "GET /search-big+assas+larag+ass+masive+ass+huge+cock+large+cock+hardcore+anal+gp+download+free/ HTTP/1.1" 200 97124 "-" "Mozilla/5.0 (compatible; DotBot/1.1; http://www.opensiteexplorer.org/dotbot, [email protected])"
93.105.187.11 - - [16/Oct/2017:22:26:18 +0000] "GET /page-7/search-desi+bhabi+sexy+boob+press+fuck+pussy+mp+mobile+ipone/date/ HTTP/1.1" 200 24021 "http://www.monsterboobshardpics.com/search-desi+bhabi+sexy+boob+press+fuck+pussy+mp+mobile+ipone/date/" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0"
93.105.187.11 - - [16/Oct/2017:22:26:23 +0000] "GET /page-14/search-boobs+milk+breathing+imeges/date/ HTTP/1.1" 200 24406 "http://www.monsterboobshardpics.com/page-9/search-boobs+milk+breathing+imeges/date/" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0"
207.46.13.183 - - [16/Oct/2017:22:27:07 +0000] "GET /page-15/search-african+black+pussy/ HTTP/1.1" 200 24382 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
202.46.58.190 - - [16/Oct/2017:22:27:06 +0000] "GET /search-big+black+fatty+boom+shemale+fuck/ HTTP/1.1" 200 24940 "-" "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.93 Safari/537.36"
51.255.65.27 - - [16/Oct/2017:22:27:09 +0000] "GET /36331 HTTP/1.1" 200 19518 "-" "Mozilla/5.0 (compatible; AhrefsBot/5.2; +http://ahrefs.com/robot/)"
93.105.187.11 - - [16/Oct/2017:22:26:08 +0000] "GET /page-12/search-tite+big+round+heavy+boobs+hd+pics/date/ HTTP/1.1" 200 25087 "http://www.monsterboobshardpics.com/page-7/search-tite+big+round+heavy+boobs+hd+pics/date/" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0"
root@ubuntu-1gb-nyc3-01:~# tail /var/log/apache2/access.log
216.244.66.228 - - [16/Oct/2017:22:27:24 +0000] "GET /search-bbw+big+hips+mom+churidar+hot+photo+xxxin/ HTTP/1.1" 200 92852 "-" "Mozilla/5.0 (compatible; DotBot/1.1; http://www.opensiteexplorer.org/dotbot, [email protected])"
46.229.168.79 - - [16/Oct/2017:22:27:25 +0000] "GET /52199 HTTP/1.1" 200 19569 "-" "Mozilla/5.0 (compatible; SemrushBot/1.2~bl; +http://www.semrush.com/bot.html)"
93.105.187.11 - - [16/Oct/2017:22:27:06 +0000] "GET /search.php?q=leanne+crow+huge+boobs+fake&page=5 HTTP/1.1" 200 10927 "http://www.bigbigbigboobs.com/search.php?q=leanne+crow+huge+boobs+fake" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0"
93.105.187.11 - - [16/Oct/2017:22:27:02 +0000] "GET /search.php?q=windian+bhabi+tight+salwar+gand+penty+showing+sexy+pic&page=2 HTTP/1.1" 200 10397 "http://www.bigbigbigboobs.com/search.php?q=windian+bhabi+tight+salwar+gand+penty+showing+sexy+pic" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0"
::1 - - [16/Oct/2017:22:27:26 +0000] "OPTIONS * HTTP/1.0" 200 126 "-" "Apache/2.4.18 (Ubuntu) OpenSSL/1.0.2g (internal dummy connection)"
157.55.39.77 - - [16/Oct/2017:22:27:24 +0000] "GET /page-13/search-pornstar+aunty+sex+videos+downloadiporntv.net/ HTTP/1.1" 200 26726 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
216.244.66.233 - - [16/Oct/2017:22:27:24 +0000] "GET /303/ HTTP/1.1" 200 79410 "-" "Mozilla/5.0 (compatible; DotBot/1.1; http://www.opensiteexplorer.org/dotbot, [email protected])"
93.105.187.11 - - [16/Oct/2017:22:27:16 +0000] "GET /?q=face+book+hot+nice+aunty+xxx+back+side+imagedate/ HTTP/1.1" 200 9673 "http://www.bigbigbigboobs.com/search.php?q=face+book+hot+nice+aunty+xxx+back+side+imagedate&page=6" "Mozilla/5.0 (X11; Linux x86_64; rv:30.0) Gecko/20100101 Firefox/30.0"
40.77.167.62 - - [16/Oct/2017:22:27:25 +0000] "GET /search-anteysex+photo.com/ HTTP/1.1" 200 23531 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
180.76.15.8 - - [16/Oct/2017:22:27:26 +0000] "GET /page-3/search-japanese+boobs+pics/random/ HTTP/1.1" 500 637 "-" "Mozilla/5.0 (compatible; Baiduspider/2.0; +http://www.baidu.com/search/spider.html)"
root@ubuntu-1gb-nyc3-01:~# tail /var/log/apache2/access.log
157.55.39.238 - - [16/Oct/2017:22:27:20 +0000] "GET /page-5/search-african+aunty+without+dress+and+bra+big+boobs+sexy+photos/ HTTP/1.1" 200 24875 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
207.46.13.20 - - [16/Oct/2017:22:27:28 +0000] "GET /page-16/search-sa+tranny+nude+pics/ HTTP/1.1" 200 25587 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
207.46.13.39 - - [16/Oct/2017:22:27:21 +0000] "GET /desi+girl+in+loose+tshirt+pics/ HTTP/1.1" 200 27165 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
157.55.39.29 - - [16/Oct/2017:22:27:26 +0000] "GET /page-14/search-hot+sexy+aunty+boobs+in+saree+hd+picturescom/ HTTP/1.1" 200 25844 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
202.46.57.88 - - [16/Oct/2017:22:27:28 +0000] "GET /page-5/search-naked+pics+of+nicole+charming/ HTTP/1.1" 200 24457 "-" "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.93 Safari/537.36"
157.55.39.149 - - [16/Oct/2017:22:26:51 +0000] "GET /page-2/search-big+boobs+pandora+peaks+bikini/ HTTP/1.1" 200 0 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
202.46.58.166 - - [16/Oct/2017:22:27:28 +0000] "GET /search-lesbian+sucking+boobs/random/ HTTP/1.1" 200 24133 "-" "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.93 Safari/537.36"
164.132.161.3 - - [16/Oct/2017:22:27:31 +0000] "GET /7241 HTTP/1.1" 302 3638 "-" "Mozilla/5.0 (compatible; AhrefsBot/5.2; +http://ahrefs.com/robot/)"
207.46.13.152 - - [16/Oct/2017:22:27:30 +0000] "GET /search-big+boobs+tite+studant/ HTTP/1.1" 200 23030 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
46.229.168.67 - - [16/Oct/2017:22:27:27 +0000] "GET /sunnyleone%20sexbeg/ HTTP/1.1" 200 20184 "-" "Mozilla/5.0 (compatible; SemrushBot/1.2~bl; +http://www.semrush.com/bot.html)"

Now I will go read and try to quickly understand everything you wrote me.

Thanks, just in time

