+-+-+-+-+-+-+
|B|R|U|I|S|E|
+-+-+-+-+-+-+
home | categories | readme

Terminal

A stash of terminal commands.


batch converting folder with Pandoc:

for i in *.md ; do echo "$i" && pandoc "$i" -f markdown -t html -o "$i.html" ; done
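the loop above names outputs like note.md.html. if you'd rather swap the extension, a variant using shell parameter expansion (a sketch, same pandoc flags as above):

```shell
# convert each markdown file, writing note.html instead of note.md.html;
# ${i%.md} strips the trailing .md before .html is appended
for i in *.md ; do
  echo "$i"
  pandoc "$i" -f markdown -t html -o "${i%.md}.html"
done
```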

app permissions change:

sudo chmod -R 755 Path\ to\ app\ file.app

batch upload with glitch.com:

~kinduff solution

reduce pdf file size (imagemagick):

convert -density 200 -compress jpeg -quality 20 input.pdf output.pdf
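note that convert rasterizes every page, so text stops being selectable. if you want to keep the text as text, ghostscript can downsample the images instead (a sketch; input.pdf/output.pdf are placeholders):

```shell
# re-write the pdf with downsampled images, keeping text selectable;
# -dPDFSETTINGS presets: /screen is most aggressive, /ebook is gentler
gs -sDEVICE=pdfwrite -dPDFSETTINGS=/screen -dNOPAUSE -dBATCH -o output.pdf input.pdf
```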

different drivers for d1 mini boards.

to list connected ports:

ls /dev/cu.*

some boards appear to work with the already-installed drivers. A link to a driver that fixed the issue for me is here. Installing the driver brought up some warnings, but everything appeared to be working after a restart.


quick exiftool reference

return all exif data:

exiftool path\ to\ file

write to a tag:

exiftool -artist=me a.jpg

reference: exiftool.org, exiftool.org/tagnames
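a couple of other invocations worth keeping around (a sketch; a.jpg is a placeholder file):

```shell
# read a single tag instead of everything
exiftool -Artist a.jpg

# strip all metadata; exiftool keeps the original as a.jpg_original backup
exiftool -all= a.jpg
```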

shared server commands


$ pinky -l username
(includes plan)

in home directory:
$ nano .plan

see who else is online
$ who

write to others
$ wall
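wall reads the message from stdin, so you can pipe it in (or type it and hit ctrl-d):

```shell
# broadcast a message to every logged-in terminal
echo "server rebooting in 5 minutes" | wall
```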

list logins
$ lslogins -u

reveal raspberry pi info
$ cat /proc/cpuinfo

traceroute
$ traceroute [ip/url]

wget

I mirror this website on a small computer at my home. It's unstable and unlikely to work for long, so I haven't left the good hosting of Greenhost just yet. For a long time I was getting a bit confused by the intricacies of SSH and filepaths while copying the site across. Presumably it would make sense to have the whole thing running with Git and pull the latest version across.

But, as is often the case, I've found a hacky, simple, manual way of doing it that suits me.


# SSH to host computer

# navigate to web directory
cd /var/www/html/

# clone the website
sudo wget --recursive --no-parent https://bruise.in

# copy new/changed files to the web directory
sudo rsync -r /var/www/html/bruise.in/ /var/www/html/

# remove the downloaded copy
sudo rm -r /var/www/html/bruise.in
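the three steps can be chained so a later step only runs if the one before it succeeded (same paths as above, still hacky):

```shell
# refresh the mirror in one go; && stops the chain on the first failure,
# so a failed download never wipes anything
cd /var/www/html/ \
  && sudo wget --recursive --no-parent https://bruise.in \
  && sudo rsync -r bruise.in/ . \
  && sudo rm -r bruise.in
```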

categories: tools


~gg 05/23
