I have a number of git repos that the original developers have since deleted - I still have them because I'd synced them to a usb stick with gitea. I think that is how you have to do it - never entrust a service, especially a free one, with your only copy of anything you value.
If the YouTube algorithm nukes your account and all your videos, you should be ready to upload them to a new account. Same with anything else digital.
My current standard is one copy in AWS S3, which is super reliable but too pricey for daily use, and one copy in Cloudflare R2 or Backblaze B2, which might or might not be reliable (time will tell) but is hella cheap for daily use.
And now you have a new repository at /media/run/usb-drive/my-backup-repo with a master branch :) It's just a normal git repository that you can also push to over the plain filesystem.
cd /media/run/usb-drive/my-backup-repo && git init --bare
Bare repositories don't have a working directory. You can still git clone / git pull from them to get the contents. You can also git push to them without clobbering any "local changes" (there aren't any).
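For example, a typical round trip might look like this (repo path from the comment above; the `backup` remote name is just for illustration):

    cd ~/projects/my-project
    git remote add backup /media/run/usb-drive/my-backup-repo
    git push backup master
    # later, from any machine that can see the stick:
    git clone /media/run/usb-drive/my-backup-repo my-project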
Yeah, better in terms of saving space, but I think it confuses some people, hence I didn't use it in my above example. The last time I recommended the `push to a directory` way of copying a git repository to a co-worker, I had them create a bare repository, and they ended up going into the directory to verify it worked and not seeing what they expected. Cue me having to explain the difference between a normal repository and a bare one. It also confused them into thinking that a bare repository isn't just another git repository but a "special" one you can sync to, while a normal one you couldn't.
So in the end, simple is simple :) Unless you're creating remote repositories at scale, you probably won't notice a difference in storage usage.
I hear all that, but --bare is necessary in this case because git (by default) won't let you push to the currently checked-out branch of a non-bare repository:
~/temp/a:master $ git push backup
Enumerating objects: 3, done.
Counting objects: 100% (3/3), done.
Writing objects: 100% (3/3), 212 bytes | 212.00 KiB/s, done.
Total 3 (delta 0), reused 0 (delta 0), pack-reused 0
remote: error: refusing to update checked out branch: refs/heads/master
remote: error: By default, updating the current branch in a non-bare repository
remote: is denied, because it will make the index and work tree inconsistent
remote: with what you pushed, and will require 'git reset --hard' to match
remote: the work tree to HEAD.
...
To ../b
! [remote rejected] master -> master (branch is currently checked out)
error: failed to push some refs to '../b'
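If you really do want to push into a non-bare checkout, git does have an escape hatch, though it rewrites the receiving work tree on every push, so use it deliberately:

    # run inside the receiving (non-bare) repository
    git config receive.denyCurrentBranch updateInstead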
Gitea has a cron task that pulls in changes on an ongoing basis.
If a snapshot suffices, a one-off "git push" or "git clone" works (but that's not too far off from downloading a tarball, is it?). If you want to have up-to-date local copies of multiple repos, a SQLite-backed Gitea instance is the simplest solution.
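A minimal sketch of standing one up, assuming the official gitea/gitea Docker image (the ports and volume name are just examples; pick SQLite3 in the first-run installer):

    docker run -d --name gitea \
      -p 3000:3000 -p 2222:22 \
      -v gitea-data:/data \
      gitea/gitea:latest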
An added bonus to using Gitea is flexibility in mirroring LFS objects, which can be sent to S3 or MinIO.
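In app.ini that looks roughly like the snippet below - the exact key names vary between Gitea versions, so treat this as an assumption and check the config cheat sheet for your release (endpoint, credentials and bucket are placeholders):

    [lfs]
    STORAGE_TYPE = minio
    MINIO_ENDPOINT = s3.example.com:9000
    MINIO_ACCESS_KEY_ID = ...
    MINIO_SECRET_ACCESS_KEY = ...
    MINIO_BUCKET = gitea-lfs
    MINIO_USE_SSL = true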
This does not include the cron jobs to pull in changes others make daily/weekly/every N hours. I mentioned that Gitea is superior under very specific conditions (where one wants to have the latest version available locally).
Have you tried running gitea? It's very light on resources, has good documentation, and also defaults to a main branch. It's also very easy to control where all the data is stored, and it works well w/ sqlite.
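For reference, the relevant bits of app.ini look roughly like this (paths are just examples; see the Gitea config cheat sheet for the full set of options):

    [database]
    DB_TYPE = sqlite3
    PATH    = /data/gitea/gitea.db

    [repository]
    ROOT           = /data/git/repositories
    DEFAULT_BRANCH = main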
That is an awful lot of contortions you are doing here, seemingly to justify a word change that has had a well-cemented meaning within the tech community since its inception.
We all know why this change exists, and why some people will attempt to persuade others of its superiority. It is, however, just silly virtue signaling, and it's exhausting to hear and read.
It would require some very irrational and underdeveloped reasoning to assert this word has anything to do with oppression in 2023. There is no negative connotation, except in those who wish to perpetuate some weird sensation of altruism... ie. no one is safer or feels better simply because you choose to call it "main" rather than "master".
But it is truly a pain in the neck that different pieces of software and even different distributions of the same software now disagree about the default.
I’ve got a handful of active projects that go together but differ on master/main because they were created by different tools.
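At least for repos you create yourself you can pin the default with git itself, e.g.:

    git config --global init.defaultBranch main    # or master - just pick one

though that obviously doesn't help with repos that other tools create for you.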
I’d prefer “hitler” if everyone could just agree to always pick that. GitHub are the big pushers of this culture change. If they succeed, I salute them.
I'm willing to go out of my way to use a different word if it makes people feel better (the root of the master -> main transition). But this is the rare case where it benefits me: I have to type two fewer characters whenever I refer to the branch! main is truly a win/win.
> I'm willing to go out of my way to use a different word if it makes people feel better
Which is just the thing, really. It makes no one feel better. It makes the privileged speaker feel better, with a false sense of virtue. It's a "look at how great I am" signal, nothing more.
No one is harmed or made to feel bad by using the word master. Sometimes the adults have to be present in the room, it seems.
Saving 2 characters is an equally silly excuse, but at least it has a realistic rationale. To that end, why stop at main - why not just 'm'? You can call it whatever you want in git.
It's because main is a goldilocks word for something like a default branch name. It isn't too long and isn't too short. And 'm' wouldn't be any shorter in terms of syllables anyway.
Yes, I've tried it (and actually run a personal instance myself), but I would never try to run an application meant as a webapp when I want to copy something from one filesystem to another, when git can already do it without any external programs.
Also, `master` is just an example; it works for `main` as well, don't worry :) The git repository created on your usb-stick works like a regular git repository, so you can use whatever branch names you want.
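For instance (using the same hypothetical `backup` remote pointing at the usb-stick repo):

    git branch -m master main    # rename your local branch if you prefer main
    git push backup main         # the usb-stick repo happily takes any branch name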
Btw, way to focus on the absolutely least interesting part of my comment, what I chose to name the branch...
The biggest reason to do this is that it supports "mirror" repositories, where it will keep your copy up to date, even using GitHub keys to get at a private repo if you want.
Yes, and if you stop the process and start it again, gitea doesn't complain and picks up right where it left off. Ditto if you lose internet connectivity. It's a well-designed piece of software. I considered using it as a BaaS and am actually thinking again of using it as one.
For this I have a NAS with a pretty basic script that runs nightly to clone any new repos I have and update those already backed up. They get organised into a directory structure mirroring that of Github: `./github.com/user/repo`
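A hypothetical sketch of such a job (user name, base path and the per_page limit are placeholders; without a token it only sees public repos, and only the first 100 of them):

    #!/bin/sh
    # List a user's repos via the GitHub API, then mirror-clone new ones
    # and fetch updates for the ones already present.
    BASE=/volume1/backups/github.com
    USER=someuser
    curl -s "https://api.github.com/users/$USER/repos?per_page=100" \
      | jq -r '.[].clone_url' \
      | while read -r url; do
          dir="$BASE/$USER/$(basename "$url" .git)"
          if [ -d "$dir" ]; then
            git -C "$dir" remote update --prune
          else
            git clone --mirror "$url" "$dir"
          fi
        done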
> If the YouTube algorithm nukes your account and all your videos, you should be ready to upload them to a new account. Same with anything else digital.
Do you know if this is a common occurrence?
Also, I'm only a YouTube viewer and am not familiar with all the creator tools, problems, communities, etc. But would a creator really re-upload all their back-catalog if deleted? Just to try to get back to views and things?
They're not banning channels based on swear words (yet, anyway). They are demonetizing videos with swearing - in the first bit of the video, or if there's too much of it, maybe other rules too - but nobody is getting banned for saying 'shit'.
Huh, that probably explains... I was in a youtube rabbit hole right around then when some videos suddenly wouldn't load; turned out that I might've been the final viewer of the (small) channel that had been banned at that moment. I was wondering what the chances were.
edit: Seems like it. The channel[1] name probably raised some new flag, and Google did its thing. Seems fair, it's not like a reasonable moderator would know of a concept of a second chance or anything.
There were a whole bunch of artist- and genre-specific mixes I used to listen to on yt that are gone now. The uploaders' accounts have all been nuked too. The sad thing is I can listen to it all on Spotify but it's not the same.. the creators put not-insignificant work into mixing the songs together.
Maybe not on YouTube, but the gunTubers are having issues with YouTube changing their interpretation of the rules and instantly issuing 3 strikes against them for rule violations. And so, it'd be good to have a back catalog to upload to a different service to keep that older material available.