- PoC with perl-WWW-Curl multi parallel HEAD requests + cache; maybe one Perl thread per connection?
- Heuristics on cache re-use.
- Send large files to MirrorCache and serve small files directly, or understand and adopt https://github.com/Firstyear/opensuse-proxy-cache
- Of N parallel requests, one is a GET and the other N-1 are HEAD requests, to learn about existence, size and mtime on the other servers.
- Alternatively, GET directory/ to discover many files at the same time, similar to MirrorCache.
- Special handling for the unversioned repomd.xml* files, to use the newest file from the N mirrors.
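The repomd.xml handling could then pick, from the per-mirror HEAD results, the mirror that serves the newest file. A minimal pure-Perl sketch — the shape of the results hash (url => {code, size, mtime}) is an assumption for illustration:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Given per-mirror HEAD results (hypothetical shape: url => {code, size, mtime}),
# return the URL of the mirror serving the newest repomd.xml.
# Mirrors that answered with an error or without an mtime are skipped.
sub newest_mirror {
    my (%results) = @_;
    my $best;
    for my $url (sort keys %results) {
        my $r = $results{$url};
        next unless $r->{code} && $r->{code} == 200 && defined $r->{mtime};
        $best = $url
            if !defined $best || $r->{mtime} > $results{$best}->{mtime};
    }
    return $best;    # undef if no mirror had a usable answer
}
```

The sort makes ties deterministic; with real mirrors one might also cross-check the size to spot half-synced mirrors.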
Use Net::HTTPServer for the HTTP server side. Could use https://metacpan.org/pod/Sys::Mmap for the cache, but mmap probably slows things down for small files.
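A minimal Net::HTTPServer front end for the proxy could look like the sketch below, assuming the RegisterRegex/Response API from the module's POD; the port and the handler body are placeholders, and the server start is guarded behind an environment variable so loading the file does not block:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Net::HTTPServer;

# Minimal proxy front end: route every path to one handler.
# Real cache lookup / mirror probing would go inside handle_req().
my $server = Net::HTTPServer->new(port => 8080);
$server->RegisterRegex(".*", \&handle_req);

sub handle_req {
    my $req = shift;               # Net::HTTPServer::Request
    my $res = $req->Response();    # Net::HTTPServer::Response
    $res->Print("would serve " . $req->Path() . " from cache\n");
    return $res;
}

# Guarded so that merely loading this file for testing does not block.
if ($ENV{RUN_PROXY}) {
    $server->Start();
    $server->Process();            # blocks, serving requests
}
```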
See /suse/bwiedemann/Export/contrib/zypp/pipelining.pl for perl-WWW-Curl usage.
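The parallel-HEAD probing would likely follow the multi-interface pattern from the WWW::Curl POD; a sketch under that assumption (the returned field names are illustrative):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use WWW::Curl::Easy;
use WWW::Curl::Multi;

# Probe N mirrors with parallel HEAD requests; collect HTTP code,
# advertised size and mtime per URL (WWW::Curl::Multi pattern from its POD).
sub parallel_head {
    my @urls  = @_;
    my $multi = WWW::Curl::Multi->new;
    my (%easy, %result);
    my $id = 1;
    for my $url (@urls) {
        my $curl = WWW::Curl::Easy->new;
        $curl->setopt(CURLOPT_URL,      $url);
        $curl->setopt(CURLOPT_NOBODY,   1);     # HEAD: headers only
        $curl->setopt(CURLOPT_FILETIME, 1);     # ask for Last-Modified as epoch
        $curl->setopt(CURLOPT_PRIVATE,  $id);   # lets info_read map back to us
        $easy{$id} = [$curl, $url];
        $multi->add_handle($curl);
        $id++;
    }
    my $active = scalar @urls;
    while ($active) {
        my $running = $multi->perform;
        next if $running == $active;
        while (my ($done_id, $retcode) = $multi->info_read) {
            if ($done_id) {
                my ($curl, $url) = @{ delete $easy{$done_id} };
                $result{$url} = {
                    retcode => $retcode,    # curl error code, 0 on success
                    code    => $curl->getinfo(CURLINFO_HTTP_CODE),
                    size    => $curl->getinfo(CURLINFO_CONTENT_LENGTH_DOWNLOAD),
                    mtime   => $curl->getinfo(CURLINFO_FILETIME),   # -1 if unknown
                };
                $active--;
            }
        }
    }
    return \%result;
}
```

For the 1-GET-plus-(N-1)-HEAD variant, one of the handles would simply drop CURLOPT_NOBODY and get a write callback for the body.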
Re-use the existing code for finding the best mirrors, in bernhard@vm12b:~/public_html/linux/opensuse/bench-http
might add code for low-bandwidth ops later
Add a cache-TTL heuristic similar to Varnish's.
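Varnish's builtin freshness logic prefers s-maxage over max-age from Cache-Control, then the Expires header, then falls back to a default TTL (default_ttl, 120s out of the box). A pure-Perl sketch of that heuristic; the pre-parsed 'expires-epoch' field is a simplifying assumption (real code would parse the HTTP date in Expires):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Varnish-like TTL heuristic: Cache-Control s-maxage, then max-age,
# then Expires, then a default. Uncacheable responses get TTL 0.
# $headers: hashref of lowercased header names; 'expires-epoch' is assumed
# to hold an already-parsed Expires value as a Unix timestamp.
sub cache_ttl {
    my ($headers, $now, $default_ttl) = @_;
    $default_ttl //= 120;    # Varnish's default_ttl is 120 seconds
    my $cc = $headers->{'cache-control'} // '';
    return 0  if $cc =~ /\b(?:no-store|no-cache|private)\b/;
    return $1 if $cc =~ /\bs-maxage=(\d+)/;
    return $1 if $cc =~ /\bmax-age=(\d+)/;
    if (defined(my $expires = $headers->{'expires-epoch'})) {
        my $ttl = $expires - $now;
        return $ttl > 0 ? $ttl : 0;
    }
    return $default_ttl;
}
```

The default-TTL fallback is what makes header-less mirror responses cacheable at all, so it matters most for this proxy.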
Goal for this Hackweek
Check out the MirrorCache scanner to see whether it can be parallelized with curl's multi interface. If that turns out to be too hard, build our own local HTTP proxy in Perl.
- worked on https://github.com/bmwiedemann/varnishcontainer/tree/nginx
- sped up bench-http
- added nginxcache sls state to my zq1-salt
- https://github.com/bmwiedemann/bench-http/ | https://www.zq1.de/~bernhard/linux/opensuse/bench-http/
- https://w3.nue.suse.com/~bwiedemann/contrib/zypp/pipelining.pl = /suse/bwiedemann/Export/contrib/zypp/pipelining.pl
This project is part of:
Hack Week 22