The term “retina” actually came from Apple. It’s a marketing term Apple used to bring a new technology to the general public – and “retina” does sound cool. It’s named, of course, after the retina in the human eye – the part that takes in light and turns it into image signals for your brain. It’s a cool name, but what does it actually mean in tech speak? Well, for all the hullabaloo, it really just means high definition. There is no difference between “retina,” high def, and any other pet name companies throw on it. The long and short of it is that images look better because there’s more data being displayed on the screen. Full stop.

The easiest way to think about what’s going on is to think of your neighbor’s yard. It looked okay – green, like lawns do – until he added a ton of fertilizer and extra seed. Now that same yard space is PACKED with grass. It looks like a golfer’s dream. That concept of fitting more into the same space is essentially what retina/HD is doing. Your neighbor’s yard looked just fine before, but now that there’s more grass in it, it looks amazing.

The same thing is going on with retina. The pixels are both smaller and closer together, and that presents a fuller image – because it is, in fact, fuller. Retina screens have the capacity to show more image data in a given amount of space – usually twice as much in each direction. But keep in mind that those beautiful images they show in stores on retina screens are actually larger images. To show those crisp mountain-top waterfall scenes, the images they use have to contain more data to pull that off. Otherwise a regular image loses its brilliance, and can even look blurry on a retina screen. The reason is that your device is taking a smaller image (for the sake of argument, a 200×100 pixel image) and stretching it across twice as much space.
If you draw a smiley face on a rubber band and then stretch it out, it gets blurry, right? Same thing happens when a non-retina image is displayed on a retina screen. The device has to take that 200×100 image and make it fill a 400×200 pixel space.
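To put rough numbers on that stretching, here’s a small sketch (plain JavaScript, using the hypothetical 200×100 image from above) of how many device pixels each source pixel has to cover on a 2x screen:

```javascript
// How many device pixels must each source pixel fill on a retina screen?
// A "2x" retina display packs 2 device pixels per CSS pixel in each direction.
function coveragePerSourcePixel(imgW, imgH, dpr) {
  const deviceW = imgW * dpr; // device pixels the image is stretched across
  const deviceH = imgH * dpr;
  return (deviceW * deviceH) / (imgW * imgH); // device pixels per source pixel
}

// A 200×100 image stretched over a 400×200 device-pixel space:
console.log(coveragePerSourcePixel(200, 100, 2)); // 4 device pixels share one source pixel
```

Four device pixels all showing one source pixel’s data is exactly why the image reads as blurry – the screen has detail to spare and the image doesn’t.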

Quick Detour: Dots per inch (DPI) has nothing to do with it

DPI, or “dots per inch,” has literally zero place when it comes to website images. Dots per inch is an old printing-industry term: the number of dots a printer head lays down per inch of paper. Let’s say I’m printing an image at 300 DPI. That means I’m asking the printer to lay down 300 dots along every inch – so a 1-inch × 1-inch square gets 300 dots across and 300 rows down.

Now bring in the web. “Printing” to the screen has no effect because, well, you’re not printing. DPI never comes into play when you’re only working with images digitally. DPI is solely a set of directions your file gives the printer so your images print well. Your monitor never uses that info. You can test this right now if you want. Set the DPI of an image to 5, 50, and 500, but keep the image’s width and height the same. It will look exactly the same across all three images (see the gallery below for samples – click here to download the zip file of all three). Print them, however, and the first will look absolutely terrible, the second better but still blurry, and the third crystal clear (assuming you started with a quality image, like stock photography).

Below are three images – all the same size (1000 pixels by 618 pixels) but with different DPIs. You’ll see they look identical on your monitor, but print very differently.
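If it helps to see the arithmetic, here’s a quick sketch (plain JavaScript, using the 1000-pixel-wide samples above) of why DPI changes print size but not on-screen size:

```javascript
// DPI only matters at print time: printed size (inches) = pixels / DPI.
// On screen, the pixel dimensions are all that count, so all three look identical.
function printedWidthInches(pixelWidth, dpi) {
  return pixelWidth / dpi;
}

// The same 1000-pixel-wide image at three different DPI settings:
console.log(printedWidthInches(1000, 5));   // 200 inches wide – hugely stretched, terrible
console.log(printedWidthInches(1000, 50));  // 20 inches – better, still blurry
console.log(printedWidthInches(1000, 500)); // 2 inches – crisp
```

The fewer pixels you spread across each printed inch, the blurrier the result – which is the print-world cousin of stretching a small image across a retina screen.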

So how do I save Retina-ready images for my website?

That’s the million-dollar question. Part of the challenge is that this technology is doing what technology does – it changes. It improves, it alters, it drives us all mad.

Part 1: To create retina-ready images, simply save your images at twice the size, so retina devices have enough data to display them well. That’s the easy part.

Part 2: Getting retina images to show up for retina users and regular images for regular users can be tricky – especially if you’re a beginner or not technical. The easiest way to handle web requests for retina images right now is to install a script on your site that senses when a retina device is accessing it, and serves retina images only to that device. It can be as simple as saving out an image with a file name, then saving a second, beefed-up version with the same file name but something like _2x appended to the end (or whatever your script uses). When a retina device is detected, every image served uses that _2x version (i.e. filename_2x.jpg).
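As a rough sketch of that idea (plain JavaScript; the _2x suffix and the example file names are just the convention described above – a real script would also handle fallbacks and caching):

```javascript
// Swap in the _2x version of an image path when the device is high-density.
// In a browser, window.devicePixelRatio reports device pixels per CSS pixel
// (2 on most retina screens, 1 on standard ones); we pass it in here so the
// function is easy to test outside a browser.
function retinaSrc(src, devicePixelRatio) {
  if (devicePixelRatio < 2) return src; // standard screen: serve the regular file
  const dot = src.lastIndexOf(".");
  return src.slice(0, dot) + "_2x" + src.slice(dot); // hero.jpg -> hero_2x.jpg
}

console.log(retinaSrc("images/hero.jpg", 2)); // "images/hero_2x.jpg"
console.log(retinaSrc("images/hero.jpg", 1)); // "images/hero.jpg"
```

On a live page, a script like this would run on load and rewrite each image’s src before the browser fetches it.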


I’m sure you have questions – this is a tough concept, but an important one. And it’s one that’s very likely to change as more and more devices with their own rules appear on the market. In the end, it’s a business call. How many resources are you going to throw at this? Think about how many images you have to process every day – do you want to keep track of it all? And remember, it’s not that images look terrible for retina users; they just don’t look amazing.

Look, the average time a user spends on a site is a couple of minutes at most. Users get their information and bail. It’s my opinion that this isn’t something worth going crazy over, but everyone’s situation is different, and you have to make a decision based on who you need to impress.

If you have a question, please leave it in the comments below and I’ll do my best to answer where I can. I also highly recommend looking for courses on retina – just be forewarned that you’re jumping into the deep end of the pool. It’s a simple concept with tons of different avenues that can muddy the waters pretty quickly.

I’d love to know what solutions you’ve come up with at your companies. New ideas on how to handle this problem come out on a regular basis, so please share below so others can find something that fits their needs too.

Talk to you all soon!