java - Making a web crawler download multiple web pages locally -


I have a web crawler that downloads the browsed URLs locally. At the minute, every site it downloads overwrites the same local file for each website visited: the crawler starts at www.bbc.co.uk, downloads the file, and when it hits the next URL it overwrites that file. How can I make it download each page into its own file so I have a collection at the end? I have the code below but don't know where to go from here. Any advice would be great. The url inside the brackets (url) is a String used to manipulate the browsed webpages.

    URL pageUrl = new URL(url);  // url is the String holding the address being crawled
    try (BufferedReader reader = new BufferedReader(
                 new InputStreamReader(pageUrl.openStream()));
         BufferedWriter writer = new BufferedWriter(
                 new FileWriter("c:/temp/data.html", true))) {  // same file on every call
        String line;
        while ((line = reader.readLine()) != null) {
            writer.write(line);
            writer.newLine();
        }
    }

You need to give the files unique names.

You can save them in different folders (one root directory for each web site).

Or you can give each file a unique name (using a counter, for example), as in the sketch below.
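For example, here is a minimal sketch of the counter approach (the class name PageSaver, the method savePage, and the c:/temp/page-N.html naming scheme are placeholders, not part of your code):

    import java.io.BufferedReader;
    import java.io.BufferedWriter;
    import java.io.FileWriter;
    import java.io.IOException;
    import java.io.InputStreamReader;
    import java.net.URL;

    public class PageSaver {

        // Incremented on every save so each page gets its own file.
        private int pageCounter = 0;

        // Downloads the page at the given URL string and writes it to
        // c:/temp/page-0.html, c:/temp/page-1.html, and so on.
        public void savePage(String url) throws IOException {
            URL pageUrl = new URL(url);
            String fileName = "c:/temp/page-" + pageCounter++ + ".html";
            try (BufferedReader reader = new BufferedReader(
                         new InputStreamReader(pageUrl.openStream()));
                 BufferedWriter writer = new BufferedWriter(
                         new FileWriter(fileName))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    writer.write(line);
                    writer.newLine();
                }
            }
        }
    }

Each call to savePage writes to a fresh file, so nothing gets overwritten. If you prefer the folder-per-site variant, you could derive the directory name from pageUrl.getHost() instead of using a counter.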

