

Not yelling, but pointing out, to people who also don’t math, that if we assume $10 per 10k emails (or $1 per 1k, for simpler math), that’d be $84 for 84,000 emails in a month, so you need to add another 0 to the figure (i.e. 840k emails in a month).
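Or spelled out: 84,000 emails × $1/1,000 emails = $84, while $840 a month would correspond to 840,000 emails.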
I’ve mainly gotten false positives, myself: when I’ve added another subdomain or something and the certificate gets set up differently, you end up with 2-3 emails saying domain X will expire, but if you connect to the URL you see it has 80+ days left. Setting up my own monitoring solution is probably long overdue anyway, and it’s nice that I’m getting forced to do it, in a way.
I set up Uptime Kuma to also monitor certs this week when I got the reminder email about them stopping the email warnings. I’ve been using it for some time for uptime monitoring (mostly to see if some automatic Docker image update screws up my services), and the notification part has worked nicely for that, so I’m assuming it will work nicely for the certificates too.
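For anyone wanting to do the same, getting it running is a single compose service; a minimal sketch (the host port and data path are just my picks):

```yaml
services:
  uptime-kuma:
    image: louislam/uptime-kuma:1
    ports:
      - "3001:3001"                    # web UI
    volumes:
      - ./uptime-kuma-data:/app/data   # keeps monitors and notification settings
    restart: unless-stopped
```

Cert expiry warnings then come from HTTPS monitors; if I remember right, the days-before-expiry thresholds are configurable under the notification settings.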
Your files are now being placed in /opt/wireguard-server/config, not in the folder where your docker-compose file is. Can you see them there with ls?
If you want the files to be created in the directory where your compose file is, you should change the path in the volume like so (notice the ./ on the left side of the colon):
- ./config:/config
Volume paths are specified as local-path:container-path, so the part before the colon decides where your files live on the host, and the part after the colon decides where the container sees them.
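If it helps, here’s a minimal sketch of how that looks in a fuller compose file; I’m assuming the linuxserver image here, so swap in whatever you’re actually running:

```yaml
services:
  wireguard:
    image: lscr.io/linuxserver/wireguard:latest   # assumption: use the image you already have
    cap_add:
      - NET_ADMIN
    volumes:
      # host path on the left of the colon, container path on the right
      - ./config:/config                         # puts the files next to the compose file
      # - /opt/wireguard-server/config:/config   # ...instead of an absolute host path like this
    restart: unless-stopped
```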
Hope this makes sense, but I just woke up, so it might not
I’m in the no-bucket, but I instead spend time on issues, helping the community, and sometimes code contributions to self-hosted projects.
The question doesn’t take that into account, but it should be considered contributing.
(I also consider donating to be contributing.)
Yeah, or Shelfplayer
I’m holding off on trying Plappa until they add session support, since I like the stats, heh.
Though I do have a TestFlight spot for the ABS app as well.
I do want to try Plappa, though! I really like the paperless-ngx app (Paperparrot) from the same developer
The ABS app is also still in beta on iOS, so unless you are tech-savvy enough to either sideload it or try to get in during one of the windows when users are booted from the TestFlight beta, you are going to have to use one of the 3rd-party apps.
Prologue just seems to do many things nicely, and user experience seems very important to the developer, so there is a huge crowd that swears by the app.
Does this sync stats for downloaded books ok?
EDIT: seems like not yet, https://github.com/LeoKlaus/plappa/discussions/15
I second this comment!
I’ve always used an nginx alpine image and have been very happy with it.
Not sure how this fork business is turning out, and I have also heard conflicting opinions on whether to care or not…
If you do wish for something simple that is not nginx, I’m also very happy with Caddy, which can also handle SSL certificates for you if you plan to make it publicly reachable.
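For what it’s worth, both are only a few lines of compose; a minimal sketch assuming you just want to serve some static files (paths, ports and domain are placeholders):

```yaml
services:
  # plain static serving with nginx
  nginx:
    image: nginx:alpine
    ports:
      - "8080:80"
    volumes:
      - ./site:/usr/share/nginx/html:ro   # your static files

  # or Caddy, which fetches/renews certificates itself if ports 80/443 are reachable
  caddy:
    image: caddy:alpine
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - ./site:/srv:ro
      - ./Caddyfile:/etc/caddy/Caddyfile:ro
      - caddy-data:/data                  # certificate storage

volumes:
  caddy-data:
```

The Caddyfile for that can be as short as a single site block containing root * /srv and file_server.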
Do you have a beefier internet connection than you would have without pirating?
Is the same true for a NAS/HDD?
I have a script that I found somewhere and have made personal tweaks to: https://pastebin.com/RJf9kajk. Put it in a file called sub-clean.sh and check the instructions in the script on how to add it to Bazarr.
I have not noticed any issues with it, but there can of course be false positives (like someone saying the word “Facebook” in a series/movie), so your mileage may vary. (This is why I personally keep the .trash.tmp file, so that I can restore the removed lines if needed.)
And as always, this script is provided as-is, but it might help someone out there.
https://glitchtip.com/blog/2022-04-25-glitchtip-1-12
This blog post mentions that it should be possible at least!
I’m currently using their free tier for a hobby project and have been happy with it. I’ve considered moving over to self-hosting the solution, but have been holding off on it due to resource constraints; I might make the leap soon, though! It would be nice to make use of the uptime pings, which currently would fill the event quota way too quickly on the free tier.
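In case anyone else is eyeing the move, the self-hosted setup is compose-based. A very rough, trimmed sketch from memory (the real file also has worker and migrate services, so check the official install docs; all the values here are placeholders):

```yaml
services:
  postgres:
    image: postgres:15
    environment:
      POSTGRES_HOST_AUTH_METHOD: "trust"   # fine on an internal network; otherwise set a password
    volumes:
      - pg-data:/var/lib/postgresql/data

  redis:
    image: redis

  web:
    image: glitchtip/glitchtip
    depends_on:
      - postgres
      - redis
    ports:
      - "8000:8000"
    environment:
      DATABASE_URL: "postgres://postgres@postgres:5432/postgres"
      SECRET_KEY: "change-me"
      PORT: "8000"
      GLITCHTIP_DOMAIN: "https://glitchtip.example.com"   # placeholder
      DEFAULT_FROM_EMAIL: "glitchtip@example.com"          # placeholder

volumes:
  pg-data:
```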
Did the sales increase as an effect of them taking it down?
Unfortunately I have the same experience as you regarding Readarr and audiobooks. I think one of the bigger issues is that audiobooks aren’t indexed much on Usenet. Both indexers I use get close to no matches when I’m searching for even slightly more niche books (I find epubs all right, but no audiobooks). To solve the issue of finding stuff you would probably need a MyAnonamouse (private torrent tracker) invite or something, since that tracker has a focus on audiobooks.
I also find that it matches my library to the wrong books (book A as book B, book B also as book B), which is then almost impossible to sort out, and even when it does find books that are monitored I need to manually import them nine times out of ten.
My best experience so far was to just download audiobooks normally, dump them in a “black hole” directory, and let Readarr import them like that.
However, the other issues got to me and I jumped ship.
Hopefully it will become a more mature product in the future, but for now I’m sticking to manual handling.
Jokes aside, I do keep some harder-to-remember stuff written down in a README.md in my repo, but most things are undocumented.