In today's shocking revelation, a "study" proves citizens can indeed count birds without causing the sky to fall.
Meanwhile, the real challenge remains deciphering the cryptic 403-error prophecy from the #Varnish #cache oracle.
https://www.ucdavis.edu/news/can-citizen-science-be-trusted-new-study-birds-shows-it-can #birdwatching #study #403error #technews #HackerNews #ngated
Great, WordPress – or rather Jetpack Boost. If you use your in-house Critical CSS tweak, the menu stops working
If the menu in your WordPress Twenty Twenty-Five theme suddenly stops being clickable, the culprit may be the
"Optimize critical CSS loading" feature in the P
https://blog.lxkhl.com/na-toll-wordpress-bzw-jetpack-boost-wenn-man-euren-hauseigenen-crticial-css-tweak-nutzt-funktioniert-das-menu-nicht-mehr
#Cache #Jetpack #Theme #TwentyTwentyFive #Wordpress
A comparison (#Srovnani) of the #LiteSpeed, #Apache, and #Nginx web servers
Performance analysis, #cache behaviour, event-driven architectures, configuration, security, and #HTTP2, #HTTP3, #QUIC support for dynamic websites
https://danielberanek.cz/srovnavaci-studie-webserveru-litespeed-apache-a-nginx/
Mastodon Account Archives
TL;DR Sometimes mastodon account backup archive downloads fail in the browser, but succeed via fetch with some flags in the terminal. YMMV.
the following are notes from recent efforts to get around browser errors while downloading an account archive link.
yes, surely most will not encounter this issue, and that's fine. there's no need to add a "works fine for me" if this does not apply to your situation, and that's fine too. however, if one does encounter browser errors (there were several unique ones, and I don't feel like digging them out of the logs), these notes might help.
moving on... after some experimentation with discarding most of the URL's dynamic parameters, I have it working on the cli as follows:
» \fetch -4 -A -a -F -R -r --buffer-size=512384 --no-tlsv1 -v ${URL_PRE_QMARK}?X-Amz-Algorithm=AWS4-HMAC-SHA256
the primary download URL (everything before the query initiator "?") has been substituted as ${URL_PRE_QMARK}, and I kept only Amazon's algorithm parameter; the rest of the query string (especially the "expire" tag) seems to be unnecessary.
IIRC the reasoning is that the CDN defaults to a computationally inexpensive front-line cache management scheme, with expiry embedded in the URL rather than looked up from metrics internal to the CDN clusters.
shorter version: dropping all of the params except the hash algo initiates a fresh, uncached hit at the edge, though it has likely been cached on a second, non-edge layer by now thanks to my incessant requests after giving up on the browser downloads.
increasing the buffer size and forcing IPv4 help with some firewall rules on my router's side, which may or may not be of benefit to others.
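for completeness, a minimal sketch of the trimming step itself, using plain POSIX parameter expansion; the placeholder names are the same redactions used throughout these notes, and the full signed URL is whatever the Mastodon export page hands you:
» FULL_URL='https://${SERVER}/${MASTO_DIR}/backups/dumps/${TRIPLE_LAYER_SUBDIRS}/original/archive-${FILE_DATE}-${SHA384_HASH}.zip?X-Amz-Algorithm=AWS4-HMAC-SHA256&...'
» URL_PRE_QMARK="${FULL_URL%%\?*}"   # drop everything from the first "?" onward
» \fetch -4 -A -a -F -R -r --buffer-size=512384 --no-tlsv1 -v "${URL_PRE_QMARK}?X-Amz-Algorithm=AWS4-HMAC-SHA256"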
- Archive directory aspect of URL: https://${SERVER}/${MASTO_DIR}/backups/dumps/${TRIPLE_LAYER_SUBDIRS}/original/
- Archive filename: archive-${FILE_DATE}-${SHA384_HASH}.zip
Command:
» \fetch -4 -A -a -F -R -r --buffer-size=512384 --no-tlsv1 -v ${URL_PRE_QMARK}?X-Amz-Algorithm=AWS4-HMAC-SHA256
Verbose output:
resolving server address: ${SERVER}:443
SSL options: 86004850
Peer verification enabled
Using OpenSSL default CA cert file and path
Verify hostname
TLSv1.3 connection established using TLS_AES_256_GCM_SHA384
Certificate subject: /CN=${SERVER}
Certificate issuer: /C=US/O=Let's Encrypt/CN=E5
requesting ${URL_PRE_QMARK}?X-Amz-Algorithm=AWS4-HMAC-SHA256
remote size / mtime: ${FILE_SIZE} / 1742465117
archive-${FILE_DATE}-${SHA384_HASH}.zip 96 MB 2518 kBps 40s
@stefano looks to be working now :)
Ugh, damn it. I just discovered that the cache plugin breaks the map display on my blog.
When I'm logged in, I can see all the POI markers on the map and the route's elevation profile, and I can download the GPX file, but without logging in there's only the map with the route drawn on it.
Time to tinker again, or switch caching off entirely, since it doesn't save the blog from the FediDDoS anyway and all the other traffic is negligible.
No hidden deficit (#déficit #caché)
useful checks and balances (#contrepouvoirs) in the era of post-truth (#postvérité)
#retraites
"Dans son rapport sur les retraites remis à François Bayrou ce 20 février, la Cour des comptes réfute le chiffrage fantaisiste du Premier ministre. Mais ses pistes pour le financement du système réduisent le champ du débat."
https://www.alternatives-economiques.fr/de-deficit-cache-pistes-de-reforme-dit-cour-com/00114091?utm_source=emailing&utm_medium=email&utm_content=21022025&utm_campaign=hebdo
How to clear the #cache on your #Windows11 PC (and why you shouldn't wait to do it)
"A quick #WordPress Super Cache fix"
https://www.phpied.com/a-quick-wordpress-super-cache-fix/ #cache
The complete #CPU lineup from #Intel for #CES2025 has already leaked, including details on clock speed (#Takt) and #Cache. 22 #Prozessor models across four different lines are expected to be unveiled at the show. https://winfuture.de/news,147639.html?utm_source=Mastodon&utm_medium=ManualStatus&utm_campaign=SocialMedia
After the server choked for a while with my last few blog posts (most likely because of federation), I swapped the cache plugin from WP Super Cache to the supposedly better LS Cache, hoping that requests from fedi instances fetching the preview card for a post would no longer bog the site down.
The site did indeed run briskly after the plugin swap, so I hoped the next post would go fine, but when it came to it, it sadly turned out to make no difference.
I'd dig into it now, but the admin panel is also returning 500s and 503s. I'll have to wait out the onslaught and look for a solution later.
Bing has also officially dropped the cache link from its search results https://www.seroundtable.com/bing-drops-cache-link-officially-38566.html
Despite the #mastodon media #cache duration option being set to 14 (now 7) days, system/cache/accounts/ is 50gb. Blegh. Are there really that many different people tooting on my timeline?
$ du -sh *
18G avatars
36G headers
`tootctl media remove-orphans` cleared only 77MB.
`tootctl media remove --prune-profiles --days 90 --dry-run` claims to remove 36GB. That's all strangers I haven't fingered in more than 3 months? So retoots/boosts/replies/etc?
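For what it's worth, the follow-ups I'll probably reach for next; a sketch assuming a source install run from the live directory, with day counts that are just my own retention choices:
$ RAILS_ENV=production bin/tootctl media usage                               # disk-usage breakdown: local media vs. cached remote media
$ RAILS_ENV=production bin/tootctl media remove --days 7                     # drop cached remote attachments older than the 7-day setting
$ RAILS_ENV=production bin/tootctl media remove --prune-profiles --days 90   # the dry run above, for real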
A weekend... Favorite photo.
by Artist: #ÉmilieMöri / #EmilieMori in Loc.: #Paris France
03/2024 - Title: "Cache-Cache" ("Hide-and-Seek") - #Art #Streetart #PhotoArt #Artist #Fotografie #Photography #Weekend #Cache
#APhotoLove
Today, we dropped our first post in a series about designing caches. This post by @xtrollyj00 introduces data locality, cache types, and the core concepts you'll need. By the end of the series, you'll have the knowledge to create your own high-performance cache! Read more here: https://fpgahero.com/blog/20241126-cache-1-introduction/ #FPGA #cache
#Intel soon wants to equip new #Prozessor models with a "local #Cache", its counterpart to #AMD's 3D V-Cache. But Intel isn't after the #Gaming or #Desktop market; that segment is said to be too unprofitable. https://winfuture.de/news,146818.html?utm_source=Mastodon&utm_medium=ManualStatus&utm_campaign=SocialMedia
For the love of god if you're trying to convince your employer or an organization to "come over" to the #Fediverse, do NOT under any circumstances suggest that they set up a #Mastodon #instance!
Mastodon is the ONLY Fediverse platform that ... by default ... forces #server #admins to #cache, #copy, and #proxy all #media that passes through its server. This means that not only are server admins paying to host the media their users #upload, but they have to pay to host the media everyone else on the fucking fediverse uploads as well.
Other platforms offer this feature, but Mastodon is the only one that has this turned on by default.
This results in Mastodon server admins having to shell out thousands of dollars each month in #S3 hosting costs for no reason whatsoever.
There are much better alternative instance platforms than Mastodon.
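If you are already stuck running one, you can at least cap the cache with a scheduled prune. A rough sketch, assuming the stock tootctl CLI and a source install under /home/mastodon/live (both assumptions on my part):
# crontab entry for the mastodon user: nightly, drop cached remote media older than 7 days
0 3 * * * cd /home/mastodon/live && RAILS_ENV=production bin/tootctl media remove --days 7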