
If you live and work somewhere with a slower internet connection and need to keep a number of Debian and/or Ubuntu systems updated, an apt proxy will make your life much easier and faster.

APT is the program that Debian and Ubuntu Linux distributions use to install and update their software. In their out-of-the-box configuration, when you use apt to install a program, e.g.:

sudo apt install inkscape

APT will download the software archive from the distribution repository servers and install it. This works great when the software package is small, there are only a few files that need downloading, and your internet connection is fast.

However, if you have a lot of software to install and have an office full of machines that all need updating and you have a slow internet connection it can take a long time to get everything downloaded and installed on every machine.

A proxy server, specifically a caching proxy, like apt-cacher-ng is a program that sits between the computer you are trying to update and the repository server that houses the software that you are downloading. The machine you are updating will request the software from the proxy server instead of the repository, and the proxy will then forward the request to the repository, download the software and hand it back to the updating machine.

The clever part is that the proxy keeps a copy of all the software files that it downloads. When another machine on the local network tries to download the same files, i.e. it wants to install or update the same software, the proxy already has a copy that it can give to the requesting machine without needing to download it again.

After the first download, all subsequent downloads will zip along at the speed of your local network.


To complete this guide, you will need the following:

  • A screen and keyboard to plug into your Raspberry Pi. These are optional if you can SSH into your Raspberry Pi.
  • A non-root, sudo-enabled user on your Raspberry Pi.
  • A Debian or Ubuntu system on your local network.

Once you have all of these requirements, log into a terminal on your Raspberry Pi as the sudo user and move on to the next section.

Installing apt-cacher-ng

Before installing any new packages on Linux, it is always a good idea to perform a system update. This will ensure that your system is running the same versions of packages as are available in the distribution repositories and also that your local list of package versions is up-to-date. Doing this will ensure that you do not encounter any errors during the installation of apt-cacher-ng.

The following commands will update your system:

$ sudo apt update
$ sudo apt upgrade

Now that your Raspberry Pi is up-to-date, install apt-cacher-ng:

$ sudo apt install apt-cacher-ng

The installer will ask you if you want to enable HTTPS tunnels through apt-cacher-ng. You should answer “No” to this question. We will configure APT to proxy HTTPS connections through apt-cacher-ng instead of needing tunnels. In addition, you can change these options in the apt-cacher-ng configuration file later if you need to.

In keeping with modern software conventions, a systemd service file is created and enabled when you install apt-cacher-ng. This means that apt-cacher-ng will automatically start on boot and you can also manage apt-cacher-ng with the normal systemd service commands:

$ sudo systemctl start apt-cacher-ng.service
$ sudo systemctl stop apt-cacher-ng.service
$ sudo systemctl restart apt-cacher-ng.service

apt-cacher-ng is now running as a system daemon listening on port 3142 and ready to accept connections from apt.

Configuring apt on the Raspberry Pi

The first system that we will configure to use the apt-cacher-ng proxy will be apt on the Raspberry Pi itself. We will do this by re-writing the URLs in apt's sources file. The sources file contains a list of URLs of the repositories where the distribution's software is available to download.

You will find the main sources file for Raspbian at /etc/apt/sources.list; on a fresh install it looks like the following:

deb http://raspbian.raspberrypi.org/raspbian/ buster main contrib non-free rpi
# Uncomment line below then 'apt-get update' to enable 'apt-get source'
# deb-src http://raspbian.raspberrypi.org/raspbian/ buster main contrib non-free rpi

The only active (un-commented) line here is the first one, i.e.:

deb http://raspbian.raspberrypi.org/raspbian/ buster main contrib non-free rpi

We need to modify this line, so open the file with a text editor; here we use nano:

$ sudo nano /etc/apt/sources.list

Modify the first line so that it looks like the following:

deb http://127.0.0.1:3142/raspbian.raspberrypi.org/raspbian/ buster main contrib non-free rpi

What you did here was to insert 127.0.0.1:3142/ into the URL, directly after http://.

The IP address 127.0.0.1 is always the IP of the local computer, often referred to as “localhost”. The :3142 part indicates the port that apt-cacher-ng listens on.

Save and exit nano by pressing CTRL+o, ENTER, CTRL+x.

You will now need to make the same change to a sources file at /etc/apt/sources.list.d/raspi.list.
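If you prefer not to edit by hand, sed can do the rewrite. This is a sketch that shows the transformation on a sample line rather than touching the live files; the echo simply stands in for a line read from /etc/apt/sources.list:

```shell
# Show the proxy rewrite: insert "127.0.0.1:3142/" directly after
# "http://" in a repository line.
echo 'deb http://raspbian.raspberrypi.org/raspbian/ buster main contrib non-free rpi' \
  | sed 's|http://|http://127.0.0.1:3142/|'
# → deb http://127.0.0.1:3142/raspbian.raspberrypi.org/raspbian/ buster main contrib non-free rpi
```

To edit the real files in place, back them up first and run the same expression with sed -i (as root) against /etc/apt/sources.list and /etc/apt/sources.list.d/raspi.list.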

apt and apt-cacher-ng are now ready for testing.

Testing apt with apt-cacher-ng

Whenever you run apt it will cache a copy of any files it downloads. apt does this so that it does not make any unnecessary downloads, and also to keep a local copy of the installation archives in case a package needs to be re-installed when no internet connection is available.

This local caching means that apt will not contact the proxy when you run apt update or apt upgrade if the repository has not changed. Therefore, in order to test the proxy, we will need to clear out apt's cache manually. The following commands will clear out all of apt's cached packages:

$ sudo rm -rf /var/lib/apt/lists/*
$ sudo rm -rf /var/cache/apt/*

Now test apt by running an update and checking for any errors:

$ sudo apt update

You should see several lines of output that look like:

Get:1 http://127.0.0.1:3142/raspbian.raspberrypi.org/raspbian buster InRelease [15.0 kB]

The URL beginning http://127.0.0.1:3142/ indicates that apt is receiving the update files from apt-cacher-ng.

You can also watch the apt-cacher-ng log file for errors by running the following command in a second terminal:

$ tail -f /var/log/apt-cacher-ng/apt-cacher.log

If you do not encounter any errors you can proceed to configure a Debian or Ubuntu system on your local network.

Configuring an Ubuntu or Debian System to use apt-cacher-ng

The Debian or Ubuntu systems on your local network that you want to benefit from the apt-cacher-ng proxy do not need any additional software installed. All that you need to do is to re-write their sources files so that they collect all of their updates from apt-cacher-ng instead of contacting the repository directly.

The change that you need to make to the sources files is exactly the same as the edit that you made to the sources on the Raspberry Pi, except that you need to use the IP address of the Raspberry Pi in place of 127.0.0.1.

This means that you will first need to get the IP address of your Raspberry Pi. The easiest way to do this is to run the following command in a terminal on the Raspberry Pi:

$ hostname -I

This will print out the IP addresses that the Raspberry Pi has. Use the first IPv4 address. In the examples below, substitute this address wherever <Pi IP> appears.
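If you need the address in a script, you can pick out the first field of the `hostname -I` output with awk. A small sketch; the sample string (192.168.1.50 is a made-up example address) stands in for the real command's output:

```shell
# hostname -I prints all of the host's addresses separated by spaces,
# IPv4 first on a typical setup, e.g. "192.168.1.50 fe80::1".
sample_output="192.168.1.50 fe80::1"   # stand-in for "$(hostname -I)"

# awk picks the first whitespace-separated field.
pi_ip=$(echo "$sample_output" | awk '{print $1}')
echo "$pi_ip"   # → 192.168.1.50
```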

Back on the client machine, open the main sources file using a text editor, here we use nano:

$ sudo nano /etc/apt/sources.list

The file will contain lines in the same format as those in the Raspberry Pi's /etc/apt/sources.list. Here is an example line from a Debian Buster installation:

deb http://deb.debian.org/debian buster main

You need to edit these lines as follows, using the IP of your Raspberry Pi, e.g.:

deb http://<Pi IP>:3142/deb.debian.org/debian buster main

Edit all the lines in /etc/apt/sources.list and any other sources files under /etc/apt/sources.list.d/. Then delete the locally cached files so you can test:

$ sudo rm -rf /var/lib/apt/lists/*
$ sudo rm -rf /var/cache/apt/*

Update the system again:

$ sudo apt update
$ sudo apt upgrade

The output from apt should indicate that the update files are coming from the apt-cacher-ng proxy by printing lines like the following that contain the IP of the proxy:

Hit:1 http://<Pi IP>:3142/deb.debian.org/debian buster InRelease

This machine is now fully configured to use your new apt proxy. You will need to edit any new sources files that you add to this machine in the future, including any new lines that get added during distribution upgrades.
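Rewriting every sources file by hand gets tedious if you manage several machines. The loop below is a sketch of how you might automate the edit; it works on scratch copies in /tmp so you can see the effect safely, and 192.168.1.50 is a made-up example for your Pi's address:

```shell
PI_IP="192.168.1.50"   # example address; substitute your Raspberry Pi's IP

# Build scratch copies instead of touching the live files under /etc/apt.
mkdir -p /tmp/apt-proxy-demo/sources.list.d
echo 'deb http://deb.debian.org/debian buster main' \
  > /tmp/apt-proxy-demo/sources.list

# Insert the proxy address after "http://" in every sources file.
for f in /tmp/apt-proxy-demo/sources.list /tmp/apt-proxy-demo/sources.list.d/*.list; do
  [ -e "$f" ] || continue                         # skip unmatched globs
  sed -i "s|http://|http://$PI_IP:3142/|g" "$f"
done

cat /tmp/apt-proxy-demo/sources.list
# → deb http://192.168.1.50:3142/deb.debian.org/debian buster main
```

To run this for real, point the loop at /etc/apt/sources.list and /etc/apt/sources.list.d/*.list with sudo, after backing the files up.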


Proxying HTTPS Repositories

The sources files that we have looked at so far have all used HTTP connections. This is a deliberate design decision by Debian and Ubuntu because the installation archives have cryptographic signatures built in that prevent malicious tampering. HTTPS therefore does not add a great deal of additional security, while adding considerably to the engineering burden of running a large number of geographically diverse mirrors.

However, there are several advantages to using HTTPS, which is why some non-official repositories use it. apt-cacher-ng supports two methods of handling HTTPS repositories.

The first is to pass connections from the client directly through to the repository server. This has the unfortunate consequence that the packages are not cached by apt-cacher-ng. If you wish to run apt-cacher-ng in this mode, open /etc/apt-cacher-ng/acng.conf with a text editor:

$ sudo nano /etc/apt-cacher-ng/acng.conf

And add the following line:

PassThroughPattern: .*

This configures apt-cacher-ng to allow the HTTPS connections to pass through from the client to the repository.

The second method is to modify the repository lines in the client’s source files so that the client connects to apt-cacher-ng via HTTP but apt-cacher-ng will then connect to the repository via HTTPS. The packages will download to apt-cacher-ng over HTTPS, then they will get sent to the client machine via HTTP. apt-cacher-ng is able to cache the packages and we do not lose the benefits of HTTPS.

The following sources line is for accessing the Docker repository over HTTPS:

deb [arch=amd64] https://download.docker.com/linux/debian buster stable

When you edited HTTP lines, you inserted 127.0.0.1:3142/ (or the proxy's IP and port) into the URL after http://. When you edit HTTPS lines, you instead replace https:// with http://<Pi IP>:3142/HTTPS///, e.g.:

deb [arch=amd64] http://<Pi IP>:3142/HTTPS///download.docker.com/linux/debian buster stable

Now the client machine will request the package from apt-cacher-ng via HTTP and apt-cacher-ng will download and cache the packages from Docker via HTTPS.
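The same kind of sed rewrite works for HTTPS lines, except that https:// is replaced rather than added to. A sketch using the Docker line from above, where 192.168.1.50 again stands in for your Pi's IP:

```shell
PI_IP="192.168.1.50"   # example address; substitute your Raspberry Pi's IP

# Replace "https://" with "http://<proxy>:3142/HTTPS///" so the client
# talks to apt-cacher-ng over HTTP while the proxy fetches over HTTPS.
echo 'deb [arch=amd64] https://download.docker.com/linux/debian buster stable' \
  | sed "s|https://|http://$PI_IP:3142/HTTPS///|"
# → deb [arch=amd64] http://192.168.1.50:3142/HTTPS///download.docker.com/linux/debian buster stable
```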

Administering apt-cacher-ng

A web GUI is available for managing apt-cacher-ng on your local network. In order to access this GUI, you need to point your browser to:

http://<Proxy IP>:3142/acng-report.html

Substitute the IP address of your Raspberry Pi for <Proxy IP>.

The first, and most important, section of the GUI, “Transfer Statistics”, provides information on the amount of data downloaded from the repositories versus the amount served from the cache. The following image shows this section of the GUI:

The “Cache efficiency” section shows how many files apt-cacher-ng served from its cache versus how many bypassed it. “Hits” are files that the proxy served from the cache, and “Misses” are files that the proxy had to download from the repository and add to the cache.

Managing the cache

The files that apt-cacher-ng downloads and serves to the client machines on your local network become stale when the developers add a new version of a package to the repository. When this occurs, apt-cacher-ng must remove the stale files from the cache, as they are no longer required and only occupy space on your drive. apt-cacher-ng reviews the cache and removes stale files automatically.

When you installed apt-cacher-ng you also installed a daily cron job. cron runs this job each day, which clears stale files from the cache for you.

If you wish to review and clear the cache manually, log into the web GUI and click the button marked “Start Scan and/or Expiration”. Doing this is usually not necessary, but you may need it if you are pulling from rapidly changing repositories.

You now have an efficient apt proxy that will relieve the network burden of large, repeated, system updates in your office or home.

