How to Download Image from Any Web Page in Java

how to download image from any web page in java

try {
    URL url = new URL("http://www.yahoo.com/image_to_read.jpg");
    Image image = ImageIO.read(url);
} catch (IOException e) {
    // handle IOException
}

See the javax.imageio package for more info. That approach gives you an AWT Image. Otherwise you could do:

URL url = new URL("http://www.yahoo.com/image_to_read.jpg");
InputStream in = new BufferedInputStream(url.openStream());
ByteArrayOutputStream out = new ByteArrayOutputStream();
byte[] buf = new byte[1024];
int n;
// Copy the stream into memory 1 KB at a time
while (-1 != (n = in.read(buf))) {
    out.write(buf, 0, n);
}
out.close();
in.close();
byte[] response = out.toByteArray();

You may then want to save the image, so do:

FileOutputStream fos = new FileOutputStream("C:/borrowed_image.jpg");
fos.write(response);
fos.close();

Java - Download image from URL

String finalURL(String url) throws IOException {
    HttpURLConnection con = (HttpURLConnection) new URL(url).openConnection();
    con.setInstanceFollowRedirects(false);
    con.connect();
    // The Location header holds the redirect target; getHeaderField already returns a String
    return con.getHeaderField("Location");
}

downloading and saving all images from given URL to desktop not fully working

Okay, so the page is working again and I got to take a closer look. Try this instead:

public static void main(String[] args) throws Exception {

    String webUrl = "http://ramp.sdr.co.za/cache/1402NYFW/GoRedForWomen/";
    URL url = new URL(webUrl);
    URLConnection connection = url.openConnection();
    InputStream is = connection.getInputStream();
    InputStreamReader isr = new InputStreamReader(is);
    BufferedReader br = new BufferedReader(isr);

    // Parse the page with Swing's built-in HTML parser
    HTMLEditorKit htmlKit = new HTMLEditorKit();
    HTMLDocument htmlDoc = (HTMLDocument) htmlKit.createDefaultDocument();
    htmlKit.read(br, htmlDoc, 0);

    // Walk every <a> tag and download the ones whose href points at an image file
    for (HTMLDocument.Iterator iterator = htmlDoc.getIterator(HTML.Tag.A); iterator.isValid(); iterator.next()) {
        AttributeSet attributes = iterator.getAttributes();
        String imgSrc = (String) attributes.getAttribute(HTML.Attribute.HREF);

        System.out.println(imgSrc);
        if (imgSrc != null) {
            String lower = imgSrc.toLowerCase();
            if (lower.endsWith(".jpg") || lower.endsWith(".png") || lower.endsWith(".jpeg")
                    || lower.endsWith(".bmp") || lower.endsWith(".ico")) {
                try {
                    downloadImage(webUrl, imgSrc);
                } catch (IOException ex) {
                    System.out.println(ex.getMessage());
                }
            }
        }
    }
}

I'm using HTMLEditorKit's read() method directly instead of using the callback. This seems to work.

Downloading an image in java

When you get a 186 byte file, open it with a text editor and see what is inside. It could contain an HTTP error message in HTML format. If instead you see the first 186 bytes of your image file, then something is not working right with your program.

EDIT: From your comments it looks like you are getting an HTTP 301 response, which is a redirect to another location. A web browser handles this automatically without you noticing, but your Java program is not following the redirect to the new location. You either need to follow the redirect yourself or use a Java HTTP library that handles redirects for you.

Download images from a HTTPS URL in Java

You need to configure the connection to trust the server's certificate. There are guides that explain how to set things up so that all certificates are trusted.


