1vuio0pswjnm7 | 6 days ago
The solution I use is a script that periodically updates a simple URL database by filtering the TLS/HTTPS proxy log. The proxy software is configured to record the full URL of every HTTP request in a response header, so the log contains every URL that has been accessed. Generally, I do not use a "modern" browser; I send HTTP requests with a variety of software, and the log captures all HTTP requests from all of it. This lets me quickly search for past URLs, irrespective of which software sent the corresponding HTTP request. The proxy is bound to a localhost address.
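For anyone curious, a minimal sketch of that kind of log-filtering script might look like the following. It assumes a plain-text proxy log with the full URL appearing somewhere on each line, and a flat one-URL-per-line file as the "database"; the paths, log format, and file layout here are hypothetical, not necessarily the setup described above.

    #!/usr/bin/env python3
    # Sketch: periodically filter a proxy log for URLs and append
    # any URLs not already present to a flat-file URL database.
    # /var/log/proxy.log and ~/.url_history are assumed locations.
    import re
    from pathlib import Path

    LOG_FILE = Path("/var/log/proxy.log")    # assumed proxy log
    DB_FILE = Path.home() / ".url_history"   # one URL per line

    URL_RE = re.compile(r"https?://\S+")

    def update_db() -> int:
        """Scan the log, append URLs not yet in the database, return count added."""
        known = set(DB_FILE.read_text().splitlines()) if DB_FILE.exists() else set()
        new = []
        for line in LOG_FILE.read_text(errors="replace").splitlines():
            for url in URL_RE.findall(line):
                if url not in known:
                    known.add(url)
                    new.append(url)
        if new:
            with DB_FILE.open("a") as f:
                f.write("\n".join(new) + "\n")
        return len(new)

    if __name__ == "__main__":
        print(f"added {update_db()} new URLs")

With a flat file like this, searching past URLs is just a grep over ~/.url_history; the script itself can be run from cron to keep the database current.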
wonger_ | 6 days ago | parent
Do you log timestamps and page titles? About how many URLs do you log in an average day or week? Curious if your consumption is similar to mine or not. | ||||||||