You can enter the long form command:
source ~/.bashrc
or you can use the shorter version of the command:
. ~/.bashrc
or you could use:
exec bash
To complement and contrast the above commands with . ~/.bashrc and exec bash:
Both solutions effectively reload ~/.bashrc, but there are differences:
. ~/.bashrc or source ~/.bashrc will preserve your current shell session:
Sourcing reads ~/.bashrc into the current shell, so the current shell process and its state are preserved, including environment variables, shell variables, shell options, shell functions, and command history.
exec bash, or, more robustly, exec "$BASH" [1], will replace your current shell with a new instance, and therefore only preserves your current shell's environment variables (including ones you've defined ad hoc, in-session).
Depending on your needs, one or the other approach may be preferred.
[1] exec bash could in theory execute a different bash executable than the one that started the current shell, if such an executable happens to exist in a directory listed earlier in the $PATH. Since the special variable $BASH always contains the full path of the executable that started the current shell, exec "$BASH" is guaranteed to use the same executable.
A note re "..." around $BASH: double-quoting ensures that the variable value is used as-is, without interpretation by Bash; if the value has no embedded spaces or other shell metacharacters (which is not likely in this case), you don't strictly need double quotes, but using them is a good habit to form.
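The difference can be demonstrated non-interactively. In this sketch, a throwaway bash replaces itself via exec; the exported variable survives the replacement, while the plain shell variable does not:

```shell
# An exported (environment) variable survives `exec bash`; a plain shell
# variable does not. The inner $-expansions are escaped so that the
# *replacement* shell, not the original one, expands them.
out=$(bash -c 'export KEEP=1; LOCAL=1; exec bash -c "echo \${KEEP:-gone} \${LOCAL:-gone}"')
echo "$out"   # prints: 1 gone
```

The same loss applies to shell functions, options set with set/shopt, and in-memory history.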
wget -r -l1 --no-parent -A "*.deb" http://www.shinken-monitoring.org/pub/debian/
-r recurse
-l1 to a maximum depth of 1
--no-parent ignore links to a higher directory
-A "*.deb" accept only files matching this pattern
You can use wget to generate a list of the URLs on a website.
Spider example.com, writing URLs to urls.txt and filtering out common asset files (css, js, images, etc.):
wget --spider -r http://www.example.com 2>&1 | grep '^--' | awk '{ print $3 }' | grep -v '\.\(css\|js\|png\|gif\|jpg\|JPG\)$' > urls.txt
Note that this gives a list that duplicates URLs.
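If the duplicates get in the way, a sort -u pass afterwards removes them. A self-contained sketch (the URLs here are illustrative):

```shell
# Simulate a collected list with a duplicate entry, then deduplicate it.
# sort -u sorts the lines and drops repeats in one step.
printf '%s\n' http://www.example.com/ http://www.example.com/a http://www.example.com/ > urls.txt
sort -u urls.txt > urls-unique.txt
cat urls-unique.txt   # 2 unique URLs remain
```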
If you mirror instead of spider, you seem to get a more comprehensive list without duplicates:
wget -m http://www.example.com 2>&1 | grep '^--' | awk '{ print $3 }' | grep -v '\.\(css\|js\|png\|gif\|jpg\|JPG\)$' > urls.txt
This will download all pages of the site into a directory with the same name as the domain.
Press Ctrl+Alt+T to open a terminal, then run one of the commands below:
HISTTIMEFORMAT="%d/%m/%y %T " # for e.g. “29/02/96 23:59:59”
HISTTIMEFORMAT="%F %T " # for e.g. “1996-02-29 23:59:59”
To make the change permanent for the current user run:
echo 'HISTTIMEFORMAT="%d/%m/%y %T "' >> ~/.bashrc # or respectively
echo 'HISTTIMEFORMAT="%F %T "' >> ~/.bashrc
source ~/.bashrc
To test the effects run:
history
For commands that were run before HISTTIMEFORMAT was set, the current time will be saved as the timestamp. Commands run after HISTTIMEFORMAT was set will have the proper timestamp saved.
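The format strings are standard strftime(3) codes, and GNU date understands the same codes, so you can preview a format before committing it to HISTTIMEFORMAT (the -d option assumed here is GNU-specific):

```shell
# Preview the two formats above against a fixed timestamp (GNU date).
date -d "1996-02-29 23:59:59" +"%d/%m/%y %T"   # 29/02/96 23:59:59
date -d "1996-02-29 23:59:59" +"%F %T"         # 1996-02-29 23:59:59
```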
With the commands
xrandr --output VGA-0 --auto
xrandr --output LVDS --off
The screen output automatically transfers to the external display; it doesn't even need sudo powers. To find out the names of the displays, just run:
xrandr -q
Which should give something like:
VGA-0 connected 1280x1024+0+0 (normal left inverted right x axis y axis) 338mm x 270mm
...
LVDS connected (normal left inverted right x axis y axis)
...
Extending the displays can probably be achieved in a similar manner.
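For example, to extend rather than mirror, xrandr's placement options can be used. A sketch, assuming the output names VGA-0 and LVDS from the xrandr -q listing above (yours may differ):

```shell
# Keep the laptop panel (LVDS) on and place the external display (VGA-0)
# to its right, letting xrandr pick each output's preferred mode.
xrandr --output LVDS --auto --output VGA-0 --auto --right-of LVDS
```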
The find command is the primary tool for recursive file system operations. Use the -type d expression to tell find you're interested in finding directories only (and not plain files). The GNU version of find supports the -empty test, so
$ find . -type d -empty -print
will print all empty directories below your current directory.
Use find ~ -… or find "$HOME" -… to base the search on your home directory (if it isn't your current directory).
After you've verified that this is selecting the correct directories, use -delete to delete all matches:
$ find . -type d -empty -delete
I would add -mindepth 1 here, to prevent deleting the starting directory itself if it happens to be empty. That's not a likely case for $HOME, but it matters if you use this on any other directory:
$ find . -mindepth 1 -type d -empty -delete
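You can verify the behavior safely in a throwaway directory (the directory names here are illustrative):

```shell
# Build a scratch tree with one empty and one non-empty directory, then
# delete only the empty ones; -mindepth 1 protects the starting directory.
tmp=$(mktemp -d)
mkdir -p "$tmp/full" "$tmp/empty"
touch "$tmp/full/file.txt"
find "$tmp" -mindepth 1 -type d -empty -delete
ls "$tmp"   # only "full" remains
rm -rf "$tmp"
```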