
If this is in reference to DoH then I found an upside. Generally, DoH servers allow HTTP/1.1 pipelining by default. This allows one to fetch DNS data in bulk over a single TCP connection. The DNS specification, RFC 1035, suggests that computer users would be able to send multiple queries in a single _packet_: QDCOUNT can be any unsigned 16-bit integer. But servers that handle a QDCOUNT greater than 1 were never implemented. At least with DoH I can send multiple queries over a single TCP connection.
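
To illustrate, a rough Python sketch of the pipelining, under the assumption that the server (a placeholder name below) accepts HTTP/1.1 pipelining: all the RFC 8484 GET requests are written over one TLS connection before any response is read.

  # Rough sketch: pipeline several DoH GET requests (RFC 8484) over one TLS
  # connection. "dns.example.net" is a placeholder; pipelining support is an
  # assumption about the server, not something the protocol guarantees.
  import base64, socket, ssl, struct

  def dns_query(name, qtype=1):
      # Minimal RFC 1035 query message: header (QDCOUNT=1) plus one question.
      header = struct.pack('!6H', 0, 0x0100, 1, 0, 0, 0)
      qname = b''.join(bytes([len(p)]) + p.encode() for p in name.split('.')) + b'\x00'
      return header + qname + struct.pack('!2H', qtype, 1)

  def pipelined_doh(host, names):
      ctx = ssl.create_default_context()
      with socket.create_connection((host, 443)) as tcp:
          with ctx.wrap_socket(tcp, server_hostname=host) as s:
              # Write every request before reading anything back.
              for i, name in enumerate(names):
                  q = base64.urlsafe_b64encode(dns_query(name)).rstrip(b'=')
                  req = (b'GET /dns-query?dns=' + q + b' HTTP/1.1\r\n'
                         b'Host: ' + host.encode() + b'\r\n'
                         b'Accept: application/dns-message\r\n')
                  if i == len(names) - 1:
                      req += b'Connection: close\r\n'  # server closes after the last reply
                  s.sendall(req + b'\r\n')
              # Responses arrive in request order; collect them until EOF.
              chunks = []
              while True:
                  data = s.recv(65536)
                  if not data:
                      break
                  chunks.append(data)
              return b''.join(chunks)  # raw HTTP responses, still to be parsed

  raw = pipelined_doh('dns.example.net', ['example.com', 'example.org'])
  print(len(raw))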

Once retrieved, I load the DNS data into the memory of the "MITM proxy". This eliminates the need for DNS queries to immediately precede the associated HTTP requests for web pages, etc., or to fall within some DNS cache duration period.
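
A minimal sketch of that proxy-side lookup; the file name and format below are only assumptions for illustration:

  # Sketch of the proxy-side lookup: a table loaded in bulk up front, so a
  # request needs no DNS query at all. The file name and format here
  # ("host address" per line) are my own invention, not the actual setup.
  import socket

  PRELOADED = {}

  def load_table(path='hosts.txt'):
      with open(path) as f:
          for line in f:
              host, addr = line.split()
              PRELOADED[host.lower()] = addr

  def resolve(host):
      # Dict hit if prefetched; fall back to the system resolver otherwise.
      return PRELOADED.get(host.lower()) or socket.gethostbyname(host)

  load_table()
  print(resolve('example.com'))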

When I use other sources of DNS data^1, I eliminate the need for remote DNS queries altogether.

1. For example, I extract DNS data from Common Crawl data.
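
As a sketch (assuming the warcio package): Common Crawl response records carry a WARC-IP-Address header with the address the crawler connected to, so host/address pairs can be pulled from a WARC file with no DNS traffic, though those addresses can be stale by the time they are used.

  # Sketch, assuming the warcio package: extract host/address pairs from a
  # Common Crawl WARC file via the WARC-Target-URI and WARC-IP-Address headers.
  import sys
  from urllib.parse import urlsplit
  from warcio.archiveiterator import ArchiveIterator

  def hosts_from_warc(path):
      pairs = {}
      with open(path, 'rb') as f:
          for record in ArchiveIterator(f):
              if record.rec_type != 'response':
                  continue
              uri = record.rec_headers.get_header('WARC-Target-URI')
              addr = record.rec_headers.get_header('WARC-IP-Address')
              if uri and addr:
                  pairs[urlsplit(uri).hostname] = addr
      return pairs

  for host, addr in hosts_from_warc(sys.argv[1]).items():
      print(host, addr)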

Indeed, it does not seem like DoH was implemented to improve life for computer users but, at least for me, it can be useful. It can also be useful, for example, to computer users who want to use remote DNS servers but whose ISP hijacks port 53.


