Can you get a list of the problem URLs?
For a website with many URLs, it's possible that some URLs are very slow to access, for example a URL with pagination. The 1st page is usually fast, but the 1000th page can be very slow. When Googlebot tries to fetch that 1000th page, it will quite likely time out and get reported as "No Response" from the server. Even if you open the 1000th page directly in a browser, it may take a very long time until some timeout is reached: Googlebot's timeout, the browser's timeout, or the web server's own timeout. On the other hand, I believe Googlebot is quite smart about this. It won't waste time crawling the same problem URLs every day; it just gives you a warning from time to time.
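If you want to see for yourself which pagination depths time out before Googlebot tells you, a quick check is to time a few deep pages with a client-side timeout. A minimal sketch in Python, assuming a hypothetical `example.com/list?page=N` pagination pattern (substitute your own site's URL pattern):

```python
import socket
import time
import urllib.request

def check_url(url, timeout=10):
    """Fetch a URL and return (status, seconds elapsed).

    status is the HTTP status code on success, or the string
    'timeout' / 'error' when the request fails before completing.
    """
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status, time.monotonic() - start
    except socket.timeout:
        return "timeout", time.monotonic() - start
    except OSError:  # URLError, DNS failure, connection refused, ...
        return "error", time.monotonic() - start

if __name__ == "__main__":
    # Hypothetical pagination URLs; replace with real ones from your site.
    for n in (1, 100, 1000):
        url = f"https://example.com/list?page={n}"
        status, secs = check_url(url, timeout=15)
        print(f"{status}\t{secs:.1f}s\t{url}")
```

Pages that return 200 quickly are fine; pages that hit `timeout` at depth are exactly the ones likely to show up as "No Response" in your crawl reports.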