A few weeks back I bought a Raspberry Pi 2 with a vague idea of using it as a device for downloading torrents and streaming videos. Using a PC for this work seems like overkill. But work, travel and other stuff didn’t leave enough time, and I could work on this project only for a few minutes every week. A few problems delayed the whole project further. But after some effort, almost everything is ready and working fairly well. There are still some things which need to be sorted out, but it meets my needs for now. I will keep tinkering to improve it and sort out the kinks. This post is about the equipment, its configuration and use. I am not writing down all the details, as most of the procedures I used are easily found on the internet. To begin with, I bought the following stuff:
Raspberry Pi 2 Board.
A plastic case.
Power adapter and USB cable.
A USB WiFi adapter.
HDMI to VGA adapter.
Cat5 Ethernet cable.
A memory card with OS pre-installed.
A pen drive I already had. Will replace it later with a hard disk.
Used keyboard, mouse and monitor of my existing PC.
1st Phase, WiFi:
The first step of the project was to assemble everything and connect it to a monitor, which was trivial. Essentially, it becomes a matchbox-sized personal computer. As the seller had already installed an OS on it, I just booted it up and started working. But the WiFi adapter included with it just refused to work. I must have spent more time troubleshooting this adapter than on all the other steps combined. Then I gave up on it, reset the system and downloaded a new version of the OS. Surprisingly, both my old and new USB WiFi adapters then worked fine without the need to install any drivers.
2nd Phase, Command Line and GUI access:
Once the connectivity issue was solved, I configured the SSH server and installed NoMachine for remote GUI login, as unplugging and plugging in the monitor and keyboard wires was becoming annoying. SSH had some problems in the beginning, but they were fixed easily. After this, I could access the Raspberry Pi from my phone and PC. NoMachine can also be used from Android.
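A minimal sketch of the remote-login part, assuming the stock sysvinit-based Raspbian image of that time; the IP address and the default `pi` user are placeholders for your own setup:

```shell
# Start the SSH server and register it to come up on every boot
sudo service ssh start
sudo update-rc.d ssh enable

# From the PC (or a phone SSH app), log in over the local network;
# 192.168.1.10 stands in for the Pi's actual address
ssh pi@192.168.1.10
```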
3rd Phase, External Storage:
Adding this was pretty straightforward. Make a folder, mount the drive on it, apply the necessary permissions and make it mount on every boot.
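The steps above look roughly like this, assuming the pen drive shows up as /dev/sda1 and /mnt/usbdrive is the chosen mount point (both are assumptions; check the device name with blkid on your own system):

```shell
# Identify the pen drive's partition and filesystem type
sudo blkid

# Create the mount point and mount the drive on it
sudo mkdir -p /mnt/usbdrive
sudo mount /dev/sda1 /mnt/usbdrive

# For a FAT-formatted drive, ownership is set at mount time, so one
# line in /etc/fstab handles both permissions and boot-time mounting:
#   /dev/sda1  /mnt/usbdrive  vfat  defaults,uid=pi,gid=pi  0  0
```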
4th Phase, Torrent Client:
I installed a torrent client, Transmission. I needed a client which could be accessed via a web GUI. There are 3-4 options available and I tried them all before settling on Transmission; I also tried Deluge and rTorrent, but wasn’t really satisfied with them. I still can’t get the web interface to run on every boot, but I am working on it and hopefully will be able to sort it out soon. In the meanwhile, Transmission works fairly well.
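A sketch of how this setup might look, assuming the transmission-daemon package from the Raspbian repositories; the whitelist entry is an assumption for a typical 192.168.x.x home network:

```shell
sudo apt-get install transmission-daemon

# Stop the daemon before editing, otherwise it overwrites
# the config file with the old values on exit
sudo service transmission-daemon stop

# In /etc/transmission-daemon/settings.json, allow web GUI access
# from the rest of the LAN (192.168.x.x addressing assumed):
#   "rpc-whitelist": "127.0.0.1,192.168.*.*",
#   "rpc-port": 9091,

sudo service transmission-daemon start

# Register the init script so the daemon (and with it the web
# interface) comes up on every boot
sudo update-rc.d transmission-daemon defaults
```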
5th Phase, External IP access:
Another step which I still haven’t figured out is how to allow access to the Raspberry Pi from an external network. I have downloaded and installed the No-IP client, which works well. But for some reason, port forwarding is not working as it should, so I can control the setup only when I’m connected to the home network. It’s not really a big issue right now and I expect to solve it soon.
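When debugging this sort of thing, a quick check like the one below can tell whether the router's port forward actually reaches the Pi. The hostname is a placeholder for whatever name was registered with No-IP, and port 9091 assumes it is Transmission's web interface being forwarded:

```shell
# Run the No-IP update client so the hostname tracks the home IP
sudo noip2

# From a machine OUTSIDE the home network, test the forwarded port
# (myhome.no-ip.org and 9091 are placeholders)
nc -zv myhome.no-ip.org 9091
```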
6th Phase, Media Server:
The next step was to stream videos to my TV and Android tablet. Thankfully, it’s a smart TV, so there is no need to buy another gadget like a Chromecast to enable networking. Enabling this feature on the Raspberry Pi was achieved by installing MiniDLNA, which takes 2-3 minutes to install and configure. My Android tablet can push movies to the TV but can’t play streaming videos by default. I tried ES File Explorer first, which worked, but I didn’t really like it that much as it seemed too bloated. I then installed iMediaShare, which works pretty well.
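The MiniDLNA setup is roughly this; the media path assumes the pen drive is mounted at /mnt/usbdrive, and the server name is arbitrary:

```shell
sudo apt-get install minidlna

# In /etc/minidlna.conf, point the server at the media folder and
# give it a name that shows up on the TV and tablet:
#   media_dir=V,/mnt/usbdrive/movies
#   friendly_name=RaspberryPi

# Restart the server and force a rescan of the media folders
sudo service minidlna restart
sudo minidlnad -R
```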
So this is the basic overview of my new setup for downloading movies and watching them on my TV and Android tablet. I can access the media on my Raspberry Pi from the tablet and TV. It still needs some fine-tuning, but it does the job for now.
For the last 3 months, I have been on a shitty internet connection which has a download limit of just 15 GB and costs an arm and a leg. But I am stuck with it because the only other option in my area is just as bad. Now, this 15 GB download limit was getting exhausted within 4-5 days, after which the bandwidth gets throttled to a snail-like 512 Kbps for the rest of the month. All of this data was gone while I was not using any bandwidth-intensive application. No torrents, videos or music streaming; just the usual work in the Opera and Chrome browsers and email.
This was bugging me a lot because my current ISP (Failtel Fraudband) is notorious for shady business practices and ripping off customers. So, to be doubly sure, I downloaded and installed the Glasswire application to monitor my bandwidth usage.
As is clear from the report above, Google Chrome is the largest bandwidth hog even though it’s my secondary browser and mostly runs in the background while I do my work in Opera. All of this bandwidth was consumed in just the 2 hours or so that I had it on. While looking for a solution, I found many people complaining of the same thing and found out that Google Chrome pre-fetches data from your most frequently visited websites and also automatically downloads some data from other links on the websites you are on.
Make webpages load faster
You can make webpages load faster by telling Google Chrome to prerender (preload) links. Google Chrome does this by predicting what links you might click, preparing them to load instantly for you.
- When you’re browsing a blog, you might click “next post” when you’re done reading. The blog can tell Google Chrome to pre-load the “next post,” so the page shows instantly when you click it.
- When you’re typing a web address in the address bar, Chrome will begin to prerender that page if it’s confident about which site you’re likely to visit (based on your local history). This will make the page show up faster when you hit enter.
Google’s Instant Pages search feature in Chrome is powered by Chrome’s prerendering technology.
While this is a good option for connections with unlimited bandwidth, it is just a nuisance for others. It means that Chrome is pre-fetching data from websites, which in a number of instances is just a waste of bandwidth. This option can be found by following the steps explained in the above-mentioned link:
In the top-right corner of the browser window, click the Chrome menu icon .
At the bottom of the page, click Show advanced settings.
In the “Privacy” section, check “Prefetch resources to load pages more quickly.” If you want to undo this permission, simply uncheck the box.
As I wanted to save bandwidth, I unchecked the box. After disabling this option, I kept Chrome running for half an hour and the bandwidth usage was minimal.
While looking through the detailed logs, I also found out that Mozilla Thunderbird, which I use as my primary email client, was also a big bandwidth hog, downloading 176 MB of data in a single day. While I use it almost all day long, that much data for text emails is way too much. By default, Thunderbird checks for messages every 10 minutes. I increased that interval to 30 minutes in Account Settings, as visible in the screenshot below.
Apart from that, a major bandwidth hog is the countless number of people sending messages with huge attachments and useless images, plus the many scammers and spammers with their malicious attachments. In the same menu, you can find the option Synchronization & Storage.
Here, you can tell Thunderbird not to download any message bigger than your specified limit. I put in 1000 KB, but you can use any value you want. Additionally, the option just above, “Synchronize the most recent”, can also be given a lower value to decrease the amount of data spent on mostly unneeded traffic.
This work was done on Windows 7. I don’t know if the settings are the same on Linux, but there is no reason anything should be different.
This tutorial is meant for people who want PHP5 on their Ubuntu computer along with Apache2, but can’t seem to make it happen. I’ll just explain it all as it happened to me.
I have an old PC running BackTrack 4 R2 which is used as a MySQL server. Its Ubuntu version is Intrepid, which isn’t officially supported anymore. As it happened, we needed to run a small webserver on it with PHP support. Although Apache2 is installed and running fine, the server wouldn’t handle PHP files and would just offer them for download instead of processing them.
Now the first step is to install PHP.
Now, as I said before, my Ubuntu version is old and not supported, so I had to manually edit my repository list at /etc/apt/sources.list and add lines pointing apt at the archive for old releases.
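The original post is trimmed at this point, but repository lines for an end-of-life Ubuntu release generally point at the old-releases archive; something along these lines, where the exact components listed are an assumption:

```shell
# Lines added to /etc/apt/sources.list for the EOL Intrepid release:
#   deb http://old-releases.ubuntu.com/ubuntu/ intrepid main restricted universe multiverse
#   deb http://old-releases.ubuntu.com/ubuntu/ intrepid-updates main restricted universe multiverse

# Then install PHP5 with the Apache2 module and restart the server
sudo apt-get update
sudo apt-get install php5 libapache2-mod-php5
sudo /etc/init.d/apache2 restart
```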
Download it from this link: http://www.jjamwal.in/dl/nodofollow.xpi
Open the XPI file from Firefox’s Open menu and it’ll be installed.
In case, you want to know how it’s done, read on.
There isn’t much to it. XPI files are just ZIP files with modified extensions. All one needs to do is unzip the file into a folder and open the install.rdf file in a text editor. Find and modify the following line: em:maxVersion="3.6"
Replace 3.6 with any version number of Firefox you want. I changed it to 6.6. :p
After that, all you need to do is zip these files up into an archive again and change the extension from ZIP back to XPI. Take care not to zip the folder into which you extracted the files; the archive should have the same content structure as the original.
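The whole repack can also be done from a shell. The file names here match the download above, and the sed pattern assumes the version appears in the attribute form quoted earlier; the exact line in your install.rdf may differ slightly:

```shell
# An XPI is just a ZIP: unpack it into its own folder
mkdir nodofollow && cd nodofollow
unzip ../nodofollow.xpi

# Bump the maximum supported Firefox version in install.rdf
sed -i 's/em:maxVersion="3.6"/em:maxVersion="6.6"/' install.rdf

# Repack from INSIDE the folder so install.rdf stays at the root
# of the archive, then rename the result from .zip to .xpi
zip -r ../nodofollow-new.xpi .
```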