Showing posts with label Scripting. Show all posts
Monday, 9 February 2015
One line web server
The following one-line script will create a web server running on port 8080 using nc (netcat):
while true; do { echo -e 'HTTP/1.1 200 OK\r\n'; cat index.html; } | nc -l 8080; done
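Note that netcat flavours differ: the BSD variant uses `nc -l 8080`, while traditional netcat wants `nc -l -p 8080`. The trailing `\r\n` matters too: together with the newline `echo` appends, it produces the blank line HTTP requires between the headers and the body. A quick sketch of the bytes involved (using printf to make them explicit):

```shell
# What the one-liner's `echo -e 'HTTP/1.1 200 OK\r\n'` emits: the status
# line, then the blank line that terminates the header section
out=$(printf 'HTTP/1.1 200 OK\r\n\n' | wc -l)
echo "$out"   # 2
```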
Labels:
Command Line,
Linux,
netcat,
Scripting,
webserver
Sunday, 7 September 2014
Purge Removed packages
Packages marked as rc by dpkg have been removed, but their configuration files have not yet been purged. The following command will purge them:
dpkg --list |grep "^rc" | cut -d " " -f 3 | xargs -r sudo dpkg --purge
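The `cut -d " " -f 3` only works because dpkg happens to pad "rc" with exactly two spaces; an awk field split is less fragile against variable spacing. A sketch against fake `dpkg --list` output (the package names are made up):

```shell
# Two fake lines in `dpkg --list` format; only the rc one should match.
# awk splits on runs of whitespace, so column padding doesn't matter.
sample='rc  oldpkg         1.0-1   amd64  removed, config files remain
ii  keptpkg        2.0-1   amd64  installed'
out=$(echo "$sample" | awk '$1 == "rc" { print $2 }')
echo "$out"
```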
How to permanently delete ALL older kernels
This script will remove ALL versions but two, the currently active and the most recent of the remaining installed versions:
#!/bin/bash
keep=2
ls /boot/ | grep vmlinuz | sed 's@vmlinuz-@linux-image-@g' | grep -v $(uname -r) | sort -Vr | tail -n +$keep | while read I
do
aptitude purge -y $I
done
update-grub
You can specify how many kernels to keep by adjusting the keep variable; if you set it to 1, only the active kernel will be left installed.
Or you can do it in one line:
ls /boot/ | grep vmlinuz | sed 's@vmlinuz-@linux-image-@g' | grep -v $(uname -r) | sort -Vr | tail -n +2 | xargs -r sudo aptitude purge -y
You can also use this one-liner in a crontab entry.
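The pipeline relies on `sort -V` ordering version strings numerically; a plain lexical sort would put 3.13.0-100 before 3.13.0-9 and the wrong kernel would survive. A quick check with made-up package names:

```shell
# sort -V treats dotted/dashed numbers as versions, so 9 < 100
out=$(printf 'linux-image-3.13.0-100\nlinux-image-3.13.0-9\n' | sort -V | tail -n 1)
echo "$out"
```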
Saturday, 30 August 2014
GitLab update script
I've recently installed GitLab; they provide easy-to-install deb and rpm packages, but no repository to help keep the installation up to date. So I developed the following script, which checks https://about.gitlab.com/downloads/archives/ for newer versions and installs them when available:
I haven't tested this script on a CentOS machine, so it might need some adjustments to work there.
#!/bin/bash
OS="ubuntu"
OS_VERSION="14.04"
OS_ARCHITECTURE="amd64"
# Ubuntu/Debian:
INSTALLED_VERSION=$(dpkg -s gitlab | grep -i version | cut -d" " -f2)
# CentOS:
#INSTALLED_VERSION=$(rpm -qa | grep omnibus)
# Uses sort -V to compare versions
LATEST=$(wget -q -O- https://about.gitlab.com/downloads/archives/ | grep -i "$OS" | grep -i "$OS_VERSION" | grep -i $OS_ARCHITECTURE | grep -Eo 'href=".*"' | cut -d'"' -f2 | sort -V | tail -n 1)
PACKAGE=${LATEST##*/}
LATEST_VERSION=$(echo $PACKAGE | cut -d_ -f2)
echo ""
echo " Current version: $INSTALLED_VERSION"
echo " Latest version: $LATEST_VERSION"
if [[ "$INSTALLED_VERSION" != "$LATEST_VERSION" && "$LATEST_VERSION" != "" ]]; then
echo " Update to $LATEST_VERSION available!"
echo -n " Do you wish to upgrade? [y/N]? "
read answer
case $answer in
y*)
# Backup branding:
cp /opt/gitlab/embedded/service/gitlab-rails/public/assets/*logo*.png /tmp/
wget $LATEST
# Stop unicorn and sidekiq so we can do database migrations
sudo gitlab-ctl stop unicorn
sudo gitlab-ctl stop sidekiq
# Create a database backup in case the upgrade fails
sudo gitlab-rake gitlab:backup:create
# Install the latest package
# Ubuntu/Debian:
sudo dpkg -i $PACKAGE
# CentOS:
#sudo rpm -Uvh $PACKAGE
# Restore branding:
sudo cp /tmp/*logo*.png /opt/gitlab/embedded/service/gitlab-rails/public/assets/
# Reconfigure GitLab (includes database migrations)
sudo gitlab-ctl reconfigure
# Restart all gitlab services
sudo gitlab-ctl restart
rm $PACKAGE
;;
*)
echo "No change"
;;
esac
else
echo " Nothing to do!"
fi
echo ""
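The version-extraction step can be exercised on its own. The `${LATEST##*/}` expansion strips everything up to the last slash, and `cut -d_ -f2` pulls the version out of the Debian-style file name. A sketch with a made-up URL (only the parsing is being demonstrated, not a real download link):

```shell
# Hypothetical archive URL in the naming scheme the script expects
LATEST="https://downloads.example.com/ubuntu-14.04/gitlab_7.4.3-omnibus-1_amd64.deb"
PACKAGE=${LATEST##*/}                         # strip the directory part
LATEST_VERSION=$(echo "$PACKAGE" | cut -d_ -f2)  # field between the underscores
echo "$PACKAGE"
echo "$LATEST_VERSION"
```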
Tuesday, 17 June 2014
IPTables debugging
The following command will only show rules whose action is DROP or REJECT, omitting the rules that didn't have any matches:
watch -n1 "iptables -nvL | grep -i 'DROP\|REJECT' | egrep -v '^\s*0\s*0'"
This one does the same but with some colour highlighting; it will only show rules with matches, the words DROP and REJECT will appear in red, and the word ACCEPT will be in green:
watch --color -n1 "iptables -nvL | egrep -v '^\s*0\s*0' | sed 's/\(DROP\|REJECT\)/\x1b[49;31m\1\x1b[0m/g' | sed 's/\(ACCEPT\)/\x1b[49;32m\1\x1b[0m/g'"
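The `egrep -v '^\s*0\s*0'` part drops rules whose packet and byte counters are both zero. Against two fake counter lines in `iptables -nvL` format (made-up addresses):

```shell
# First line has zero counters and is filtered out; the DROP line survives
sample='    0     0 ACCEPT     tcp  --  *   *   0.0.0.0/0   0.0.0.0/0   tcp dpt:22
   42  3360 DROP       all  --  *   *   10.0.0.0/8  0.0.0.0/0'
out=$(echo "$sample" | grep -Ev '^\s*0\s*0')
echo "$out"
```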
Saturday, 16 November 2013
Running a PowerCLI Script
I'm a Linux user and found some interesting scripts for VMware on the Internet; the problem was I didn't know how to run them.
Now I've finally learned how to run a PowerShell script, so here's how I did it:
Save the script to a file with a filetype of .ps1
Open the PowerCLI prompt on your PC
Connect to the vCenter with the Connect-VIServer cmdlet
Dot-source the .ps1 file so the function becomes known in your PowerCLI session:
. ./yourfile.ps1
Call the function implemented by the script.
Labels:
PowerShell,
Scripting,
Virtualization,
vmware
Tuesday, 29 October 2013
Export everpad notes to HTML files
If you use Everpad you can use the following script to export your notes to html files. The script will write the notes into ~/exported_notes and the notes will be sorted under folders with their notebook names:
import os
import sys
import datetime
import sqlite3

export_path = os.getenv("HOME") + '/exported_notes'
# Create the export directory if it does not exist
if not os.path.exists(export_path):
    os.makedirs(export_path)
# Create a connection to the database.
filename = os.getenv("HOME") + '/.everpad/everpad.5.db'
conn = sqlite3.connect(filename, detect_types=sqlite3.PARSE_DECLTYPES|sqlite3.PARSE_COLNAMES)
# Create a cursor object to do the interacting.
c = conn.cursor()
# Grab the columns we need.
sql = 'SELECT notebooks.name, notes.title, notes.content FROM notes'
sql += ' inner join notebooks on notes.notebook_id = notebooks.id'
rows = c.execute(sql)
# Iterate over the result: one folder per notebook, one HTML file per note.
for row in rows:
    note_path = export_path + '/' + row[0].replace("/", "_")
    note_path = note_path.replace(" ", "_")
    if not os.path.exists(note_path):
        os.makedirs(note_path)
    note_path += '/' + row[1].replace("/", "_") + '.html'
    note_path = note_path.replace(" ", "_")
    note = open(note_path, 'a+')
    note.write('<h1>' + row[1].encode('utf-8') + '</h1><br/>')
    note.write(row[2].encode('utf-8'))
    note.close()
# Commit the changes and close everything.
conn.commit()
c.close()
conn.close()
Friday, 11 January 2013
Calculating total disk usage by files with specific extension
For example if you want to check how much space is being used by log files on your entire system, you can use the following:
find / -type f -name "*.log*" -exec du -b {} \; | awk '{ sum += $1 } END { kb = sum / 1024; mb = kb / 1024; gb = mb / 1024; printf "%.0f MB (%.2fGB) disk space used\n", mb, gb }'
Just replace "*.log*" with the file extension you want to search for and the above will give you the total disk space used by all the files with that extension.
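The awk part simply sums the first column (the byte counts from `du -b`) and converts the units. Fed three fake du lines (1 MiB + 2 MiB + 1 MiB), the arithmetic works out like this:

```shell
# Fake `du -b` output: byte count, tab, path (paths are made up)
out=$(printf '1048576\t/var/log/a.log\n2097152\t/var/log/b.log\n1048576\t/var/log/c.log\n' \
  | awk '{ sum += $1 } END { kb = sum / 1024; mb = kb / 1024; gb = mb / 1024; printf "%.0f MB (%.2fGB) disk space used\n", mb, gb }')
echo "$out"
```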
Labels:
Command Line,
Linux,
Scripting
Thursday, 29 November 2012
File rotator script
This is a script that I use to rotate some logs, the commented lines will tell you what it does exactly:
#!/bin/sh
#-----------------------------------------------------------------------
# FILE ROTATOR SCRIPT
#
# The purpose of this script is to rotate, compress and delete files
# - Files older than ARC_AGE are gzipped and rotated
# - Files bigger than SIZE_LIM are gzipped and rotated
# - Gzipped files older than DEL_AGE are deleted
#
#-----------------------------------------------------------------------
# Vars
DATE=`date +%F"-"%H:%M`
FILEDIR="/storage/logs/"
DEL_AGE="30"
ARC_AGE="1"
SIZE_LIM="20M"
# Diagnostics
echo "-= Rotation starting =-"
echo " Directory to search: $FILEDIR"
echo " File age to check for deletion: $DEL_AGE"
echo " File age to check for archive: $ARC_AGE"
echo " File size to check for archive: $SIZE_LIM"
echo " "
# Compress all uncompressed files whose last modification occurred more than ARC_AGE days ago
echo "-= Looking for old files =-"
FILES=`find $FILEDIR -type f -mtime +$ARC_AGE -not \( -name '*.gz' \) -print`
echo "Files to be archived:"
echo $FILES
echo " "
for FILE in $FILES; do
# Compress but keep the original file
gzip -9 -c "$FILE" > "$FILE".$DATE.gz;
# Check if the file is being used:
lsof "$FILE" > /dev/null 2>&1
ACTIVE=$?
# Delete inactive files, truncate if active
if [ $ACTIVE != 0 ]; then
# Delete the file
rm "$FILE";
else
# Truncate file to 0
:>"$FILE";
fi
done
# Compress all uncompressed files that are bigger than SIZE_LIM
echo "-= Looking for big files =-"
FILES=`find $FILEDIR -type f -size +$SIZE_LIM -not \( -name '*.gz' \) -print`
echo "Files to be archived:"
echo $FILES
echo " "
for FILE in $FILES; do
# Compress but keep the original file
gzip -9 -c "$FILE" > "$FILE".$DATE.gz;
# Truncate original file to 0
:>"$FILE";
done
echo "-= Deleting old archived files =-"
FILES_OLD=`find $FILEDIR -type f -mtime +$DEL_AGE -name '*.gz' -print`
echo "Archived files older than $DEL_AGE days to be deleted:"
echo $FILES_OLD
echo " "
# Deletes old archived files.
find $FILEDIR -type f -mtime +$DEL_AGE -name '*.gz' -exec rm -f {} \;
echo "-= Rotation completed =-"
echo " "
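The `:>"$FILE"` used for files that are still open empties the file in place rather than deleting it, so the writing process keeps a valid file handle. A minimal sketch of that truncation:

```shell
# Create a throwaway file, write to it, then truncate it in place
tmp=$(mktemp)
echo "some log data" > "$tmp"
: > "$tmp"                          # truncate to zero bytes without unlinking
out=$(wc -c < "$tmp" | tr -d ' ')
echo "$out"
rm -f "$tmp"
```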
Thursday, 15 November 2012
Rename files from upper case filename to lower case
The following one line script will rename every file (in the current folder) to lowercase:
for i in *; do mv "$i" "$(echo "$i" | tr '[:upper:]' '[:lower:]')"; done
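The quoting matters here: an unquoted `[:upper:]` can be glob-expanded by the shell if a matching file name exists, and unquoted `$i` breaks on names with spaces. The tr step on its own:

```shell
# tr maps the POSIX upper-case class onto the lower-case class
out=$(echo "MyFile.TXT" | tr '[:upper:]' '[:lower:]')
echo "$out"
```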
Labels:
bash,
Command Line,
Scripting
Thursday, 27 September 2012
tail -f with highlighting
If you want to highlight something when doing ‘tail -f’ you can use the following command:
tail -f /var/log/logfile | perl -p -e 's/(something)/\033[7;1m$1\033[0m/g;'
or, if your terminal supports colours (e.g. the linux terminal), you can use this:
tail -f /var/log/logfile | perl -p -e 's/(something)/\033[46;1m$1\033[0m/g;'
If you need to highlight multiple words you can use something like this:
tail -f /var/log/logfile | perl -p -e 's/\b(something|something_else)\b/\033[46;1m$1\033[0m/g;'
and if you want it to beep on a match use this:
tail -f /var/log/logfile | perl -p -e 's/(something)/\033[46;1m$1\033[0m\007/g;'
If you find that perl is too heavy for this you can use sed:
tail -f /var/log/logfile | sed "s/\(something\)/\x1b[46;1m\1\x1b[0m/g"
Note that if your sed does not understand the \x1b escape (GNU sed does), you have to type a literal escape character ("Ctrl-V Ctrl-[") in its place.
For the full list of control characters on Linux you can look at:
man console_codes
Labels:
Command Line,
Scripting,
Unix
Monday, 10 September 2012
MySQL Export to CSV
If you need the data from a table or a query in a CSV file so that you can open it in any spreadsheet software, like Excel, you can use something like the following:
SELECT id, name, email INTO OUTFILE '/tmp/result.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '\\'
LINES TERMINATED BY '\n'
FROM users WHERE 1
Or you can use sed:
mysql -u username -ppassword database -B -e "SELECT * FROM table;" | sed 's/\t/","/g;s/^/"/;s/$/"/;s/\n//g' > filename.csv
Explanation:
username is your mysql username
password is your mysql password
database is your mysql database
table is the table you want to export
The -B option will delimit the data using tabs and each row will appear on a new line.
The -e option denotes the MySQL command to run, in our case the "SELECT" statement.
The "sed" command used here contains four sed scripts:
s/\t/","/g - replaces all occurrences of tabs with ","
s/^/"/ - places a " at the start of the line
s/$/"/ - places a " at the end of the line
s/\n//g - removes any stray newlines
You can find the exported CSV file in the current directory. The name of the file is filename.csv.
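The quoting step can be tried on a single fake tab-separated row (GNU sed understands `\t`; on other seds you may need a literal tab):

```shell
# One tab-delimited row of made-up data, quoted into CSV
out=$(printf '1\tAlice\talice@example.com\n' | sed 's/\t/","/g;s/^/"/;s/$/"/')
echo "$out"
```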
However if there are a lot of tables that you need to export, you'll need a script like this:
Just be sure to change the configuration section to meet your needs.
#!/bin/bash
#### Begin Configuration ####
DB="mydb"
MYSQL_USER="root"
MYSQL_PASSWD='mypass'
MYSQL_HOST="127.0.0.1"
MYSQL_PORT="3306"
MYSQL="/usr/bin/mysql"
#### End Configuration ####
MYSQL_CMD="$MYSQL -u $MYSQL_USER -p$MYSQL_PASSWD -P $MYSQL_PORT -h $MYSQL_HOST"
TABLES=`$MYSQL_CMD --batch -N -D $DB -e "show tables"`
for TABLE in $TABLES
do
SQL="SELECT * FROM $TABLE;"
OUTFILE=$TABLE.csv
$MYSQL_CMD --database=$DB --execute="$SQL" | sed 's/\t/","/g;s/^/"/;s/$/"/;s/\n//g' > $OUTFILE
done
Name the file something like: export_csv.sh and be sure to make it executable. In Linux, do something like:
chmod +x ./export_csv.sh
If you want to have all of the exported files in a certain directory, you could either modify the script or just make the directory, "cd" into it, and then run the script. It assumes you want to create the files in the current working directory.
To change that behavior, you could easily modify the "OUTFILE" variable to something like:
OUTFILE="/my_path/$TABLE.csv"
Tuesday, 26 June 2012
Bash script lock file
Here's the lock file mechanism that I use for some of my bash scripts.
I check if the lock file is older than a reasonable time for the script to execute completely, just in case the script or the machine running it crashed in the middle of its execution and the lock file would otherwise block the process forever.
#!/bin/bash
#Check if the lockfile exists and is older than one day (1440 minutes)
minutes=1441
LOCKFILE=/tmp/extract.lock
if [ -f $LOCKFILE ]; then
echo "Lockfile Exists"
filestr=`find $LOCKFILE -mmin +$minutes -print`
if [ "$filestr" = "" ]; then
echo "Lockfile is not older than $minutes minutes, exiting!"
exit 1
else
echo "Lockfile is older than $minutes minutes, ignoring it and proceeding normal execution!"
rm $LOCKFILE
fi
fi
touch $LOCKFILE
##Do your stuff here
rm $LOCKFILE
exit 0
Another approach is to store the PID of the current process in the lock file and check if the process is still running. The first approach permits parallel execution of your script but gives the first instance a head start of one day (or whatever time you define in the $minutes variable), whilst this second method only allows a new instance of the script to start once the previous one has terminated:
#!/bin/bash
LOCKFILE=/tmp/extract.lock
if [ -f $LOCKFILE ]; then
echo "Lockfile Exists"
#check if process is running
MYPID=`head -n 1 "${LOCKFILE}"`
TEST_RUNNING=`ps -p ${MYPID} | grep ${MYPID}`
if [ -z "${TEST_RUNNING}" ]; then
echo "The process is not running, resuming normal operation!"
rm $LOCKFILE
else
echo "`basename $0` is already running [${MYPID}]"
exit 1
fi
fi
echo $$ > "${LOCKFILE}"
##Do your stuff here
rm $LOCKFILE
exit 0
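The core of the PID check can be exercised against the current shell, whose PID certainly exists. This sketch uses `ps -p` directly with its exit status, a slightly tidier test than grepping the ps output:

```shell
# Write our own PID into a throwaway lock file, then check it is alive
LOCKFILE=$(mktemp)
echo $$ > "$LOCKFILE"
MYPID=$(head -n 1 "$LOCKFILE")
if ps -p "$MYPID" > /dev/null 2>&1; then
    out="running"
else
    out="stale"
fi
echo "$out"
rm -f "$LOCKFILE"
```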
Labels:
Command Line,
Linux,
Scripting
Tuesday, 19 June 2012
Watermark script
I was asked to create a script to scale down, correct the rotation and add a watermark to a bunch of photos, here's the result.
This script will shrink the images by 25%, then it will check the rotation based on the exif information and finally, it will apply a text watermark, a logo image can also be used, check the commented lines.
#!/bin/bash
if [ -z "$1" ]
then
location=$(pwd)
else
location=$1
fi
#Address of the watermark file
#WATERMARK="/home/ldavim/Desktop/watermark.svg"
# Check if the directory "watermarked" exists or create it.
if [ ! -e "${location}/watermarked" ]
then
mkdir ${location}/watermarked
fi
echo "Applying watermark, resize by 25% and rotate by exif info..."
#Loop over all the images in the folder
for image in $location/*.jpg $location/*.JPG $location/*.jpeg $location/*.JPEG $location/*.png $location/*.PNG
do
if [ ! -e "$image" ] # Check if the file exists.
then
continue
fi
newImage=${location}/watermarked/$(basename "$image")
#Scale image by 25%
convert "${image}" -resize 25% "${newImage}"
#Compute the watermark point size from the image width
size=`identify -format %[fx:w/76] "$newImage"`
#Correcting image rotation
exiftran -a -i "${newImage}"
#Apply the watermark and create a new image in the "watermarked" subfolder
##Using an image overlay
#composite -dissolve 20% -gravity southeast -background none \( $WATERMARK -geometry ${size} \) ${image} "${newImage}"
##Using Draw text
#convert "${newImage}" -font Sketch-Block-Bold -pointsize ${size} -draw "gravity southeast fill white text 0,12 'www.STYLETRACES.com' fill black text 1,11 'www.STYLETRACES.com'" "${newImage}"
##Using annotations
convert "${newImage}" -pointsize ${size} -font Sketch-Block-Bold -fill rgba\(255,255,255,0.3\) -gravity southeast -annotate 270x270+7+251 'www.STYLETRACES.com' "${newImage}"
convert "${newImage}" -pointsize ${size} -font Sketch-Block-Bold -fill rgba\(1,1,1,0.3\) -gravity southeast -annotate 270x270+8+250 'www.STYLETRACES.com' "${newImage}"
done
echo "Done."
#If you have zenity installed, a message will pop up when the process is complete
#zenity --info --title "Watermarker!" --text "Process Complete!"
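The `identify -format %[fx:w/76]` expression scales the watermark text to the image width; for a hypothetical 1520-pixel-wide image the same arithmetic done in the shell gives:

```shell
# Same width/76 calculation as the ImageMagick fx expression
width=1520
size=$(( width / 76 ))
echo "$size"
```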
Labels:
Command Line,
imagemagick,
Linux,
Scripting
Monday, 11 June 2012
Check which files a process has open
This script will output a list of the files that are open by a given process:
#!/bin/bash
PROCESS=$1
log_found=`ps faux|grep -v grep|grep $PROCESS|awk '{print $2}'`
if [ "$log_found" == "" ]; then
echo "No process found"
else
echo "Open files:"
for PID in $log_found; do
#ls -l /proc/$PID/fd/ | awk '{print $NF;}'
ls -l /proc/$PID/fd/
done
fi
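The script relies on the Linux-specific /proc layout: /proc/<pid>/fd holds one symlink per open file descriptor. You can inspect the current shell's own descriptors the same way:

```shell
# Every live process normally has at least stdin (0), stdout (1), stderr (2)
out=$(ls /proc/$$/fd | sort -n | tr '\n' ' ')
echo "$out"
```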
Labels:
Command Line,
Linux,
Scripting
Tuesday, 29 May 2012
List users with running processes
Show the unique list of users running processes on the system:
ps haexo user | sort -u
Show the unique list of users running processes on the system, prefixed by the number of processes for that user:
ps haexo user | sort | uniq -c
Same as above, but sorted by the number of processes:
ps haexo user | sort | uniq -c | sort -nr
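The `sort | uniq -c | sort -nr` counting pattern is easy to check against fake ps output (made-up user names):

```shell
# root appears three times, alice twice; awk normalizes uniq's padding
out=$(printf 'root\nalice\nroot\nroot\nalice\n' | sort | uniq -c | sort -nr | awk '{ print $1, $2 }')
echo "$out"
```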
Labels:
Command Line,
Linux,
Scripting
Tuesday, 10 April 2012
Bash script to sort files into folders by filetype
This script will list the files in the current directory, create (if needed) a folder for each type of file and then move the files into their respective folders:
#!/bin/bash
file -N --mime-type -F"-&-" * | grep -v $0 | awk -F"-&-" 'BEGIN{q="\047"}
{
o=$1
#gsub("/","_",$2); # uncomment to make folders like "image_jpeg" instead of "image/jpeg"
sub("^ +","",$2)
if (!($2 in dir )) {
dir[$2]
cmd="mkdir -p "$2
print cmd
#system(cmd) #uncomment to use
}
files[o]=$2
}
END{
for(f in files){
cmd="mv "q f q" "q files[f]"/"f q
print cmd
#system(cmd) #uncomment to use
}
}'
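The `-F"-&-"` separator exists so awk can split the file name from the MIME type even when the name contains spaces. The parsing step can be fed a canned line directly (avoiding a dependency on the `file` binary; the file name is made up):

```shell
# One line in the format `file -N --mime-type -F"-&-"` produces
out=$(echo 'photo.jpg-&- image/jpeg' | awk -F"-&-" '{ sub("^ +", "", $2); print $2 }')
echo "$out"
```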
Labels:
Command Line,
Linux,
Scripting
Wednesday, 7 March 2012
CloudStack LDAP
References:
- http://docs.cloudstack.org/CloudStack_Documentation/Design_Documents/LDAP_Authentication
- http://docs.cloudstack.org/CloudStack_Documentation/Developer's_Guide%3A_CloudStack
First you need to configure LDAP by making an API call with an URL like this:
http://127.0.0.1:8096/client/api?command=ldapConfig&hostname=127.0.0.1&searchbase=ou%3Dpeople%2Co%3DsevenSeas&queryfilter=%28%26%28uid%3D%25u%29%29&binddn=%20cn%3DJohn+Fryer%2Cou%3Dpeople%2Co%3DsevenSeas&bindpass=secret&port=10389&response=json
Or in a more readable format:
http://127.0.0.1:8096/client/api?command=ldapConfig
&hostname=127.0.0.1
&searchbase=ou%3Dpeople%2Co%3DsevenSeas
&queryfilter=%28%26%28uid%3D%25u%29%29
&binddn=%20cn%3DJohn+Fryer%2Cou%3Dpeople%2Co%3DsevenSeas
&bindpass=secret
&port=10389
&response=json
Note the URL encoded values; here you have the decoded version:
http://127.0.0.1:8096/client/api?command=ldapConfig
&hostname=127.0.0.1
&searchbase=ou=people,o=sevenSeas
&queryfilter=(&(uid=%u))
&binddn= cn=John Fryer,ou=people,o=sevenSeas
&bindpass=secret
&port=10389
&response=json
You can use this link to encode/decode your URL -> http://meyerweb.com/eric/tools/dencoder/
After you've created your URL (with encoded values), open your browser, log into CloudStack and then fire up your LDAP config URL.
Now if you go back to CloudStack and search for LDAP under "Global Settings", you should see that LDAP is configured.
Now you have to manually create the user accounts with the same logins as in your LDAP server, or you can use the CloudStack API to script a "sync" of your LDAP users into CloudStack; I've written a PHP script that does this.
You'll have to modify it to match your LDAP schema, and you can get it after the break.
Labels:
CloudStack,
LDAP,
PHP,
Scripting,
Virtualization
Thursday, 5 January 2012
Organize your photos with a script
I have a lot of photographs and they were distributed over several external disks and computers, so I needed a way to organize them. I searched the web and found a script that used the EXIF data to organize the images into folders by year, month and day. I picked that up and modified it a bit to better fit my needs; I ended up with the script that you can check after the break.
I also use this script to move the photos from my camera to my PC.
Note: The original script can be found here: http://davehope.co.uk/Blog/sorting-your-photos-with-bash/
Labels:
Command Line,
Linux,
Scripting
Friday, 23 September 2011
Packet loss monitoring with zabbix
1. create a file named "packetloss" at this location "/etc/zabbix/externalscripts/"
vi /etc/zabbix/externalscripts/packetloss
note: you may need to create the external scripts directory first:
mkdir -p /etc/zabbix/externalscripts
2. cut and paste this into the "packetloss" file:
#!/bin/sh
if [ -z "$1" ]
then
echo "missing ip / hostname address"
echo " example ./packetloss 192.168.201.1 10000"
echo "10000 = 10000 bytes to ping with. The more you use, the harder the network has to work to deliver it and the sooner you start to see packet loss. Pinging with the normal ping size is kinda pointless; on LAN networks I recommend using 10000 - 20000 and on the Internet around 1394 (1500 - 48 (pppoe + IP + TCP) - 58 (ipsec))"
echo "Remember some firewalls might block pings over 100"
echo " "
exit
fi
if [ -z "$2" ]
then
echo "missing ping size"
echo " example ./packetloss 192.168.201.1 10000"
echo "10000 = 10000 bytes to ping with. The more you use, the harder the network has to work to deliver it and the sooner you start to see packet loss. Pinging with the normal ping size is kinda pointless; on LAN networks I recommend using 10000 - 20000 and on the Internet around 1394 (1500 - 48 (pppoe + IP + TCP) - 58 (ipsec))"
echo "Remember some firewalls might block pings over 100"
echo " "
exit
fi
PINGCOUNT=10
tal=`ping -q -i0.30 -n -s $2 -c$PINGCOUNT $1 | grep "packet loss" | cut -d " " -f6 | cut -d "%" -f1`
if [ -z "$tal" ]
then
echo 100
else
echo $tal
fi
3. Make the file runnable by typing:
chmod +x /etc/zabbix/externalscripts/packetloss
4. in zabbix verify that the host/template you want to monitor the packet loss on has a valid IP or host name and the correct "Connect to" selected.
Then under Item you create a new Item for that host/template
Type: External Check
Key: packetloss[10000]
SAVE
5. now check monitoring -> latest data for that host and you should start seeing packet loss values.
Done.
The number 10000 is Ping size, its very hard to spot packet loss when only sending a few bytes as a normal ping does.
Try increasing the size until you see packet loss then you know you pushing your equipment to the limit.
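The cut pipeline in the script extracts the loss percentage from ping's summary line. Against a canned summary in the GNU iputils format (made-up host and numbers):

```shell
# Fake `ping -q` output; field 6 is "10%", and the second cut strips the %
sample='--- 192.168.201.1 ping statistics ---
10 packets transmitted, 9 received, 10% packet loss, time 9012ms'
out=$(echo "$sample" | grep "packet loss" | cut -d " " -f6 | cut -d "%" -f1)
echo "$out"
```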
Labels:
Monitoring,
Networking,
Scripting,
Zabbix