We look at them every day. Many of us look at them for most of the day.

Wrist screens. Phone screens. Phablet screens. Tablet screens. Laptop screens. Desktop screens. TV screens. Cinema screens. Billboard screens.

Often we have multiple screens in front of us at once.

As web and app designers, we spend all our time thinking about designing for these screens. Designing how to present information on screens, designing how to navigate that information on screens, designing how to receive input and display feedback on screens.

But what if there were no screens? What if screens went the way keyboards are going on phones, and disappeared? What if we liberated our content from screens and could present it to people by other means?

How would that change how we design?

Google Glass is still a screen-based device, but it's heading in the screen-less direction.

The iPod Shuffle was screen-less.

The Misfit Shine is screen-less (and often criticised for being so).

But how would we work differently if more devices were screen-less?

Researchers have already produced working versions of a device that projects its content onto your hand or arm (see Skinput turns your arm into a touchscreen).

Augmented Reality devices and apps are showing us glimpses of how screen-less worlds might look, of how we might design for a time when information is overlaid on the physical world around us instead of being stuck on a screen.

The explosive growth of smartphones forced many designers to think about screen size for the first time. Design became about being adaptive and responsive to the capabilities and sizes of the devices that content is being displayed on.

How would we adapt to no screen at all?