AI as an assistant to designers

“We shape our tools and, thereafter, our tools shape us.” – John Culkin

Popular digital tools like those included in Adobe’s Creative Cloud have radically shaped how designers work, requiring them to work for hours in front of a screen with a mouse or tablet. But how might AI shape how designers work in the near future?

In this post I will briefly examine some of the remarkable technologies being explored and researched, to help paint a clearer picture of how creating in an AI-powered future might look.


AI as a tool for discovery

Patrick Hebron, a programmer, designer, and professor at NYU’s Interactive Telecommunications Program, has written much about the possibilities of AI in design. In his article Rethinking Design Tools in the Age of Machine Learning, Hebron argues that it is not correct to think of AI as a magic button that creates great designs with a single click of the mouse. This is a skewed view of how AI works because, in the design process, many human decisions shape the final outcome, and those decisions cannot be contained in a single magical AI-powered button.

Many human decisions shape the final outcome, and those decisions cannot be contained in a single magical AI-powered button

Instead, Hebron advocates for thinking of AI as a tool for discovery and a way for designers to design by exploring solutions. The designer still leads an AI tool instead of the other way around. Hebron sees this playing out by way of AI systems quickly constructing novel variations of a design that the designer can then select, edit, tweak, and refine.

One example can be seen in AI generating a grid of headline fonts, offering various font and size combinations (see fig. 1). From this AI-generated list of designs, a graphic designer could then select the font and size combination they find most useful. Next, the AI system would generate another grid of even more nuanced options that are variations of the original selection, so the designer could fine-tune the design even further.

To put it more concretely, if the designer liked the option that combined Museo Sans and Adobe Caslon at 11pt and 8pt type, then the next options generated would be slight variations of that selection. This workflow would allow the designer to build expertise in making larger design decisions rather than focusing on the time-consuming technical skills required to manually explore numerous variations and combinations through panel menus. Over time, the AI system would get smarter through better training and observational learning, eventually allowing it to provide the designer with insights backed by data.
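The select-and-refine loop described above can be sketched in a few lines of code. This is a hypothetical illustration, not Hebron’s implementation; the font names, size steps, and data structures are all assumptions:

```python
import itertools
import random

# Illustrative font pools -- any real tool would draw from the
# designer's installed font library instead.
HEADLINE_FONTS = ["Museo Sans", "Futura", "Gotham"]
BODY_FONTS = ["Adobe Caslon", "Garamond", "Minion"]

def initial_grid():
    """First pass: a coarse grid of font and size combinations."""
    return [
        {"headline": h, "body": b, "headline_pt": hp, "body_pt": bp}
        for h, b in itertools.product(HEADLINE_FONTS, BODY_FONTS)
        for hp, bp in [(11, 8), (14, 9), (18, 10)]
    ]

def refine(selection, step=1, n=6):
    """Second pass: more nuanced variations around the designer's pick."""
    variations = []
    for _ in range(n):
        variations.append({
            "headline": selection["headline"],
            "body": selection["body"],
            "headline_pt": selection["headline_pt"] + random.choice([-step, 0, step]),
            "body_pt": selection["body_pt"] + random.choice([-step, 0, step]),
        })
    return variations

options = initial_grid()
# The designer picks the Museo Sans / Adobe Caslon option at 11pt.
pick = next(o for o in options
            if o["headline"] == "Museo Sans" and o["headline_pt"] == 11)
finer = refine(pick)  # nearby variations for the designer to fine-tune
```

Each round narrows the search around the designer’s last choice, which is the sense in which the designer leads the tool rather than the other way around.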

For example, the AI system could provide pro/con breakdowns for each option, so that when the elderly are the target users, font combinations and sizes that are more comfortable for them are suggested. This could go even further, with the graphic designer creating an experience in which a website adapts to different users, just as responsive web design resizes websites to accommodate a multitude of screen sizes. The AI system could identify a visiting user and adapt the content in real time, making the site a living design, customized for whoever is visiting.


AI design recommendations

Researchers at Adobe Research are already exploring the idea of “design suggestions” through the DesignScape system. DesignScape makes interactive layout suggestions for users to help them create better layout designs. DesignScape’s layout suggestions are based on fundamental design principles such as alignment, symmetry and contrast.

Not only does DesignScape make layout recommendations, it can also adjust design elements in real time, automatically moving elements to what it judges to be better placements. The researchers conclude their paper by stating that suggestion-based design tools are best suited for touch interfaces, which implies this type of tool might pull designers away from their long-favored mice and tablets.

Another example of AI assisting in the design process comes from a research project at MIT that predicts what is visually significant in a design. The AI system makes real-time predictions about how a design might be received by a viewer as the designer works. Its output resembles the heatmaps used on websites to show designers and developers where users click first and most often on a page.

[Figure: AI visual importance prediction]

For example, if a designer wanted the AI system to look for contrast issues while designing, the system could analyze the document and overlay red on areas of strong contrast, yellow on areas of medium contrast, and green on areas of light contrast. This visual overlay could then help a designer lay out their work using predictive software connected to a database of information on various use cases and demographics. It is easy to see how this type of interactive feedback could help designers work more quickly and make better-informed decisions that aren’t purely shots in the dark.
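As a rough sketch of how such a contrast check might work under the hood, the snippet below computes a contrast ratio between a foreground and a background color (borrowing the WCAG relative-luminance formula) and maps it onto the red/yellow/green overlay scheme described above. The thresholds are illustrative assumptions, not part of any tool described in this post:

```python
def relative_luminance(rgb):
    """Relative luminance of an sRGB color, per the WCAG definition."""
    def channel(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(v) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio, ranging from 1:1 up to 21:1."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

def overlay_color(fg, bg):
    """Map a contrast ratio to the article's overlay scheme (thresholds assumed)."""
    ratio = contrast_ratio(fg, bg)
    if ratio >= 7:
        return "red"      # strong contrast
    if ratio >= 4.5:
        return "yellow"   # medium contrast
    return "green"        # light contrast
```

Black text on a white background hits the maximum 21:1 ratio and would be flagged red; pale gray on white would fall into the green band.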

AI as an assistant production designer

Creative technologist Sam Snider-Held shows in his article How I Taught A Machine To Take My Job how he trained a neural network to anticipate the 3D objects he wanted modeled and placed in his 3D environment. To accomplish this, he trained the system by placing objects in virtual 3D space for five hours. After Snider-Held generated thousands of images and labels from this training exercise, the system could identify patterns of visuals that correlated with his actions.

For example, the AI system identified that whenever Snider-Held had a cluster of rocks in his 3D scene, he would next place grass beside them. As a result, the system could anticipate this and add grass to the environment as soon as his mouse got close to the rocks. Snider-Held speculates that this idea of anticipational design could find its way into Illustrator or Photoshop via a plug-in, so AI could take care of tedious work or even, given enough information, keep working on files after the designer had gone home for the day. This is an early example of what deep learning is capable of when implemented in creative production software.
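The pattern-learning idea behind this can be sketched as a simple next-action model: count which object the artist tends to place after each object type, then suggest the most frequent follower. This is a toy illustration with invented data, far simpler than the neural-network approach the article describes:

```python
from collections import Counter, defaultdict

def train(action_log):
    """Count which object follows each object in the recorded sessions."""
    follows = defaultdict(Counter)
    for prev, nxt in zip(action_log, action_log[1:]):
        follows[prev][nxt] += 1
    return follows

def predict_next(follows, current_object):
    """Suggest the object the artist most often places next, if any."""
    if current_object not in follows:
        return None
    return follows[current_object].most_common(1)[0][0]

# Invented placement log: the artist repeatedly follows rocks with grass.
log = ["rock", "grass", "rock", "grass", "tree", "rock", "grass"]
model = train(log)
# predict_next(model, "rock") -> "grass"
```

A real system would condition on much richer context (spatial proximity, rendered pixels) rather than a flat sequence of object names, but the anticipation loop is the same: observe, learn the habit, then offer to do it first.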

In the context of a designer’s busy schedule, or when a project comes in with a quick deadline, this idea of training AI based on a designer’s creative actions is compelling. AI could help a designer generate a bunch of ideas as a starting point, or the AI could work on the project while the designer is out of the office so a project will not be turned in late. This might be similar in form to “rendering” in 2D and 3D animation, in which the designer drafts various components in a scene, focusing on the big picture and overall narrative, and letting the computer output a final render in full detail and quality.

AI as a generator of content

An interesting experiment from AI researchers at MIT demonstrated that it is possible to generate new video with an AI system. The researchers fed short video clips into their system, which then generated a full second of new video for each clip based on predictions of what would come next in the scene. The most impressive example was one in which the system generated the next second of waves crashing in the ocean. Admittedly, the results in their current form are crude, resembling what one might see when Photoshop’s Content-Aware Fill gets things wrong. But the technology provides an impressive glimpse of a future in which both still and moving content can be generated automatically, without needing to go out and physically shoot it.



All of these examples provide an interesting look at what the future of design could hold. So my question becomes: when computers handle many of the tasks designers are used to doing, what tasks will remain for designers? What skills should designers be honing? What is the true value of design in a world of smart computers that can do creative work?

Check out my post here for my thoughts on those questions and more.

Let’s be friends!

If you enjoyed this post, please consider staying up to date by signing up for my email newsletter and following CreativeFuture on Twitter and Facebook.

Posted by Dirk Dallas

Dirk Dallas holds an M.F.A. in Graphic Design and Visual Experience from Savannah College of Art and Design. In addition to being a designer, he is also a writer, speaker, educator, and the founder of CreativeFuture and From Where I Drone. See what he is up to on Twitter via @dirka.
