Blogs

UserBeanCounters resources, vmguarpages and kmemsize explained

If you're hosted on a VPS, the notes below explain how to check whether you are getting the resources you paid for:

As reported in the resource counters (/proc/user_beancounters):

vmguarpages	0	30,000	2,147,483,647	4KB pages	\
      Memory allocation guarantee

This is the guaranteed RAM you get, which works out to:

30000 x 4 / 1024 = 117.1875 MB

Accordingly kmemsize is set to:

kmemsize	7,167,393	12,288,832	13,517,715	bytes	\
     Size of unswappable memory, allocated by the operating system kernel

As a minimum, kmemsize should be at least 10% of the vmguarpages guarantee, which holds for the current setup:

12288832 / 1024 / 1024 = 11.7 MB == 10% of 117 MB (vmguarpages)
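
A quick way to pull the same numbers straight from inside the container is below. The field positions assume the usual held/maxheld/barrier/limit column layout of /proc/user_beancounters, so adjust the awk field number if your kernel's layout differs:

# grep -E "vmguarpages|kmemsize" /proc/user_beancounters
# awk '/vmguarpages/ {printf "%.1f MB guaranteed\n", $4 * 4 / 1024}' /proc/user_beancounters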

Static apache-1.3.x and php-4.x compile for dotProject

  1. Download and unpackage the source files:
    $ cd /usr/local/src
    $ wget http://www.ibiblio.org/pub/mirrors/apache/httpd/apache_1.3.41.tar.gz
    $ wget http://us2.php.net/get/php-4.4.8.tar.gz/from/us.php.net/mirror
    $ tar -xvzf apache_1.3.41.tar.gz
    $ tar -xvzf php-4.4.8.tar.gz
  2. Preconfigure apache:
    $ cd apache_1.3.41
    $ make clean
    $ ./configure
  3. Configure, compile, install php:
    $ cd ../php-4.4.8
    $ make clean
    $ ./configure \
    --with-gd \
    --with-jpeg-dir \
    --with-png-dir \
    --with-zlib-dir \
    --with-freetype \
    --with-freetype-dir=/usr/lib \
    --enable-gd-native-ttf \
    --enable-memory-limit \
    --with-ldap \
    --with-mysql=/usr/local/mysql \
    --with-apache=../apache_1.3.41
    $ make
    # make install
  4. Configure, compile, install apache:
    $ cd ../apache_1.3.41
    $ ./configure \
    --prefix=/usr/local/apache \
    --enable-module=rewrite \
    --enable-module=so \
    --activate-module=src/modules/php4/libphp4.a
    $ make
    # make install
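
As a quick sanity check after the install (assuming the default /usr/local/apache prefix used above), list the statically compiled modules and confirm mod_php4 is present:

    $ /usr/local/apache/bin/httpd -l | grep php
      mod_php4.c

Also remember to add "AddType application/x-httpd-php .php" to httpd.conf so that .php files are handed off to the PHP module.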

Install / Setup Nagios on Fedora 7

These are supplementary notes I took down a while back while installing nagios and n2rrd on a Fedora 7 box; they recently came in very handy when doing the same install on a Red Hat (RHEL 3) box as well:

Nagios Install:

# yum install nagios nagios-plugins nagios-plugins-http nagios-plugins-icmp nagios-plugins-ping

On RHEL 3, I used RPMforge (Dag's RPMs). Here is my yum.conf entry for rpmforge:

[rpmforge]
name = Red Hat Enterprise $releasever - RPMforge.net - dag
baseurl = http://apt.sw.be/redhat/el3/en/$basearch/dag

Notes: I settled on check_icmp instead of check_ping, as it is a lot faster. However, check_icmp required setting the setuid bit on "/usr/lib/nagios/plugins/check_icmp" in order to work. I also had to rebuild nagios-plugins from source as root in order for the plugin to be installed.
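
For reference, setting the setuid bit on the plugin looks like this (the path is as packaged on Fedora and may differ on other distros):

# chmod u+s /usr/lib/nagios/plugins/check_icmp
# ls -l /usr/lib/nagios/plugins/check_icmp    # should now show -rwsr-xr-x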

Cloning oscommerce website for development

  1. Create dev.domain.tld site in hosting control panel.
  2. Create the database and user.
  3. Copy over the database and files (see the example commands after this list).
  4. Update the database and user info in "includes/configure.php" and "admin/includes/configure.php".
  5. Also update the file paths in the configure.php files.
  6. Update the links to point to the development site:
    $ for x in `grep -r www.domain.tld * -l` ; do perl -pi \
      -e 's/www\.domain\.tld/dev\.domain\.tld/g' $x ; done
  7. Optionally disable SSL in "includes/configure.php".
  8. Change ownership of files as required.
  9. Log in as admin and change the cache location.
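
For step 3, the copy typically looks something like the following; the database names, users, and paths here are just placeholders for your own setup:

    $ mysqldump -u prod_user -p prod_db > prod_db.sql
    $ mysql -u dev_user -p dev_db < prod_db.sql
    $ cp -a /var/www/vhosts/domain.tld/httpdocs/. /var/www/vhosts/dev.domain.tld/httpdocs/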

encode / decode base64 file

I use SquirrelMail in text mode, and recently I had to retrieve a password at Dell. The password reset email, however, comes with HTML embedded in base64 encoded text, so my webmail did not display the embedded HTML link. With some knowledge of uudecode, I was able to view the exact link to reset the password.

The sharutils package contains the GNU utilities uuencode and uudecode.

Here's a brief background and usage:

uuencode is a program that encodes binary files as plain ASCII text so that they can be sent through electronic mail. The program expands single characters that can't be viewed or printed as normal text into pairs of text characters, and the resulting encoded file is somewhat larger than the original binary file. This process is necessary to prevent mail, news, and terminal programs from misinterpreting the non-text characters in binary files as special instructions.

$ cat test.txt
hello world!

$ uuencode -m test.txt test.txt > test.txt.base64

$ cat test.txt.base64
begin-base64 644 test.txt
CmhlbGxvIHdvcmxkIQoKCg==
====

$ rm test.txt

$ uudecode test.txt.base64

$ cat test.txt
hello world!

First I copied over the actual base64 encoded text from my local email folder which looks similar to:

$ cat dell_reset
DQpUaGlzIGVtYWlsIHdhcyBzZW50IHRvIHlvdSBpbiByZXNwb25zZSB0byB5b3VyIHJlcXVl
... ...
bGluayBoYXMgYSBsaWZlIHNwYW4gb2YgdGhyZWUgZGF5cyBvbmx5LjxiciAvPg0KPGJyIC8+DQo=

I then added header and footer lines to mark the base64 encoded block, as below:

$ cat - dell_reset <<<"begin-base64 644 dell_reset.txt" > dell_reset.base64
$ echo "====" >> dell_reset.base64
$ cat dell_reset.base64
begin-base64 644 dell_reset.txt
DQpUaGlzIGVtYWlsIHdhcyBzZW50IHRvIHlvdSBpbiByZXNwb25zZSB0byB5b3VyIHJlcXVl
... ...
bGluayBoYXMgYSBsaWZlIHNwYW4gb2YgdGhyZWUgZGF5cyBvbmx5LjxiciAvPg0KPGJyIC8+DQo=
====

The text was then decoded using:

$ uudecode dell_reset.base64
$ cat dell_reset.txt
This email was sent to you in response to your request to modify your Dell.com account.<br />

Click the link below to go to the Dell site and modify your account:<br />
... ...
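
As an aside, where the coreutils base64 tool is available, the raw encoded text can also be decoded directly, without adding the begin/end wrapper lines (an alternative to the uudecode approach above):

$ base64 -d dell_reset > dell_reset.txt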

sendmail access.db by example

The sendmail access database file can be created to accept or reject mail from selected domains.

Since "/etc/mail/access" is a database, after creating the text file, use makemap to create the database map.

# makemap hash /etc/mail/access.db < /etc/mail/access
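
To double-check what actually made it into the map, the entries can be dumped back out of the database (the -u option reverses makemap):

# makemap -u hash /etc/mail/access.db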

Below is what my access file currently looks like; it can be used as a starting point. All internal addresses have been changed, except for the spammers!

# by default we allow relaying from localhost...
localhost.localdomain           RELAY
localhost                       RELAY
127.0.0.1                       RELAY

# Allow Connect from local server IPs
Connect:207.44.206.144   OK

# Accept Mail
# accept mail from PayPal
paypal.com      OK

# Reject Mail
posterclub@e.allposters.com     REJECT
posterclub@email.allposters.com REJECT
plastmarket.com                 REJECT
jr@jrtr.org                     REJECT
7b2.606@fe01.atl2.webusenet.com REJECT
mysoldpad.com                   REJECT

# Discard Mail
1and1-private-registration.com  DISCARD
# forum admin mails:
fictionaluser@gmail.com         DISCARD

# Reject full mailbox
fictionaluser@linuxweblog.com ERROR:4.2.2:450 mailbox full
fictionaluser@linuxweblog.net REJECT

# Blacklist recipients
linuxweblog.net ERROR:550 That host does not accept mail

# Spam friend domains: exempt domains from dnsbl list checking
Spam:linuxweblog.org      FRIEND

# Spam friend users: exempt email users from dnsbl list checking
# example:
# Spam:user@domain.tld         FRIEND
# clients
Spam:fictionalclient@hotmail.com  FRIEND

# Auto REJECT via hourly cron added below

Analyzing proftpd xferlog file

Recently I had to investigate some missing files on a website.

When looking through the proftpd xferlog files, it was clear that the files were deleted by a user having ftp access.

The xferlog file is usually located at "/var/log/xferlog". However, since this was a plesk server, it was located at:
"/var/www/vhosts/{DOMAIN}/statistics/logs/xferlog_regular*"

A quick grep produced the list of files that had been deleted, which could then easily be recovered from a previous backup. It also revealed the time and the offending IP address of the person who did the deletes.

Full listing:

$ grep "_ d" /path/to/xferlog

Listing of just the deleted files:

$ awk '/_ d/ {print $9}' /path/to/xferlog

Below are some additional notes on xferlog analysis:
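
As a rough reference for the field positions used above, here is a made-up xferlog entry (not from the actual log), with the filename in field 9 and the "_ d" pair marking a delete:

Thu Jul 17 12:34:56 2008 1 192.0.2.10 0 /httpdocs/missing-file.php a _ d r ftpuser ftp 0 * c

The fields are: date/time, transfer time, remote host, file size, filename, transfer type, special-action flag, direction (i/o/d, where proftpd logs d for a delete), access mode, username, service, and so on.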

sysstat in ubuntu

If you apt-get install sysstat, and sar returns:

Cannot open /var/log/sysstat/sa27: No such file or directory

sar needs to be enabled before it can be used. The error message in this case is completely useless; the solution I found is below:

Enable sysstat data collection by running:

# dpkg-reconfigure sysstat

or manually, by changing the value of ENABLED from "false" to "true" in "/etc/default/sysstat".
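
After the change, the relevant line in "/etc/default/sysstat" should look like:

ENABLED="true"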

Then start sysstat via:

# /etc/init.d/sysstat start

Check "/var/log/sysstat/" for the missing file.

Run sar after about 10 minutes to see the collected data.

# sar -A
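
To take live samples without waiting for the cron-driven collection, sar can also be given an interval and a count directly, for example CPU usage every second, three times:

# sar -u 1 3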

extract plesk 8.1 backup files

The mpack package is required in order to extract the backup contents.

# apt-get install mpack
# zcat /path/to/backup_file > backup_file.mime
# munpack backup_file.mime

The result is a set of tar and sql files containing the domains' directories and databases. Untar the directories as needed.

For example, to restore the httpdocs folder for the DOMAIN.TLD domain:

# tar xvf <DOMAIN.TLD>.htdocs
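
The extracted SQL dumps can be loaded back in a similar way; on a Plesk box the MySQL admin password is kept in /etc/psa/.psa.shadow (the database and file names below are just placeholders):

# mysql -uadmin -p`cat /etc/psa/.psa.shadow` DATABASE_NAME < DATABASE_NAME.sql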

Reference:

How to extract web files, databases etc from Plesk backup manually?
