I want to set up a server holding a Raspberry Pi SD card image and synchronize a lot of Raspberry Pis with this image, so that if I update the main image on the server, all my Raspberries get synchronized within a few minutes or an hour.
Any ideas?
If you mean an `.img` file containing everything, then no, since:
- In order to sync with a networked system, the pi actually has to be running.
- You cannot re-write the entire SD card in the pi, period. You have to take it out to write an `.img` onto it.
That means you have to sync the root filesystem (i.e., the second partition in most pi-oriented images) while the system is running. This is essentially the same as maintaining a backup, except you are regularly syncing from somewhere instead of to somewhere.
Have a look at my answer here WRT maintaining a backup using `rsync`. All of the information regarding which directories not to sync is even more important if you are actually syncing to the pi and not from it. So to repeat: be very sure you DO NOT include those directories in your sync. Note that most of them are just empty stubs in the distro `.img` files anyway, but if your remote backup was created from a running pi, you might have naively included them. Also, you need to use `--delete`, and if you don't explicitly exclude those directories and they are empty in the source, the sync will attempt to erase everything in them (very bad).¹
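To make that concrete, here is a minimal sketch of what such a pull might look like, run on the pi itself. The hostname `server` and the path `/srv/pi-master-rootfs` are placeholders for your own setup, and the exclude list is a starting point, not gospel. Note the `--dry-run`: inspect the output before you let it touch anything.

```
# Pull the master root filesystem onto this pi (SKETCH -- adjust paths/host).
# Remove --dry-run only once you are satisfied with what it reports.
sudo rsync -aAXv --delete --dry-run \
    --exclude='/dev/*'  --exclude='/proc/*'  --exclude='/sys/*' \
    --exclude='/run/*'  --exclude='/tmp/*'   --exclude='/mnt/*' \
    --exclude='/media/*' --exclude='/lost+found' \
    --exclude='/var/log/*' --exclude='/var/spool/*' --exclude='/var/tmp/*' \
    root@server:/srv/pi-master-rootfs/ /
```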
At the bottom there's a paragraph about doing this with `ssh` on a network. You are not limited to using `ssh`, but I believe it does have to be some kind of shell access. However, you could use NFS to mount a remote directory tree containing the filesystem and then rsync it locally.
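The NFS variant has roughly this shape (again a sketch: the export name is an assumption, and `/root/sync-excludes.txt` is an exclude file you would maintain yourself, one pattern per line):

```
# Mount the server's master tree read-only, sync locally, unmount.
sudo mkdir -p /mnt/master
sudo mount -t nfs -o ro server:/srv/pi-master-rootfs /mnt/master
sudo rsync -aAXv --delete --exclude-from=/root/sync-excludes.txt /mnt/master/ /
sudo umount /mnt/master
```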
Remember, the first URL or directory path is the source and the second is the destination, and that answer is written presuming the goal is a backup to the remote, not from it.
You may encounter some problems when the sync attempts to replace loaded binaries (it will fail; hopefully this does not create inconsistencies -- think about what I am trying to say in the footnote below), so make sure the system is otherwise idle (servers stopped, etc.) when you do it, along these lines:
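Here the service names are only examples, and `pull-master-image.sh` is a hypothetical wrapper around whatever rsync invocation you settle on:

```
# Quiesce the pi, pull the image, bring services back up (SKETCH).
sudo systemctl stop nginx cron              # substitute the daemons your pis run
sudo /usr/local/sbin/pull-master-image.sh   # hypothetical wrapper for the rsync above
sudo systemctl start nginx cron
```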
¹ In addition to what's listed there, you're going to want to exclude some things in `/var`, namely at least `/log`, `/spool`, and `/tmp`. Do a lot of thinking about stuff like this. Play around with `rsync` a bit first to be sure you understand how it, `--exclude`, etc., work. To be honest, I would never do this or recommend it to anyone. Instead, I would contemplate more refined ways to accomplish whatever it is you are trying to do. For example, if it is just about keeping the software updated without 20 computers all downloading the same updates from afar, you'd be much better off setting up a local repository, keeping that updated, and then configuring `apt-get update` or whatever to use that. Etc. It might be a bit more work initially, but there is much less risk of headaches later.
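For instance, a caching proxy such as apt-cacher-ng gets you most of the benefit of a local repository with very little setup (a sketch; `server` is a placeholder hostname):

```
# On the server: install the caching proxy (listens on port 3142 by default).
sudo apt-get install apt-cacher-ng

# On each pi: point apt at the cache, then update as usual.
echo 'Acquire::http::Proxy "http://server:3142";' | \
    sudo tee /etc/apt/apt.conf.d/00aptproxy
sudo apt-get update
```

With that in place, only the first pi to request a given package pulls it from the internet; the rest get it from the local cache.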