Machine Learning: Are designers even needed anymore?
If designers are merely tasked with creating the pastiche and repetitive, maybe it’s for the best if the whole process gets automated.
This article is published as part of Us vs Them, a guest edit of It’s Nice That commissioned and curated by creative director Richard Turley. To read further articles from Richard’s takeover head here.
Anyone who has used a machine learning tool in the last year will know the size of the hole it blows in your brain. So, when we first started talking about the contents of this guest editorship in April, it felt like a pertinent discussion point – the original Dall-E and Midjourney had recently launched and there was a bit of tension and excitement in the air. The speed of development since then has been rapid (as has the feeling of burnout from seeing so many images). Even the images we've used as illustrations for this piece seem old (they were made in May).
I like that machines have no egos, no desire to fart “their vision” over everything. Machines will offer you infinite options. They won't throw a hissy fit when you don't like something, or cry when they can’t use some illegible hype-y font. And machines have no desire to go to shoots, work with expensive talent, sleep, eat, or get paid. Don't get me wrong, I like all those things about humans (especially the hissy fits), but I also like having an inert member of "the team" who just spits shit out. That seems like a great idea to me. As John Warwicker said in our piece on tomato: “Work fast, to the point of the work making itself and you are trying to catch it.”
Beyond an examination of the subject, I also wanted to do a sort of Dummies Guide/Wikihow explainer on one way we are using AI – in this case the design of a new soft drink brand – in co-pilot mode, prompting us on everything from the naming to the logo and the can design – and breaking this particular process out for you. Maybe this is useful, maybe not. We're going to launch Fizuy-Lub regardless. – Richard Turley
The visuals for this feature were created as a case study showcasing how designers and AI can collaborate on creative projects together. Thank you to Iain Tait, Commercial Type and Jae Yeon Kim for collaborating with us.
You, a design principal, have just got off a Zoom call with a skincare client. They love your work and want you to pitch. You already know what they want to hear: a campaign to capture the hearts of the environmentally conscious Gen Z and a logotype that’s simultaneously bold and elegant. You traverse your studio – a chilly converted warehouse with workstations optimised for collaboration – to tell your most trusted designer the good news. Sparks fly, you exchange ideas. You’re proud of your now-mature team that operates like clockwork, a utopia of synergy and creativity.
But there’s a niggling sense of disappointment in your heart. You’ve seen the future. You have a feeling that your output will disappear in the swarm of skincare brands. All of them bold. All of them elegant. Maybe yours will lean towards mauve, rather than the more commonly used nude (whatever that means), but you feel like you’ve seen this all before.
01: To start, we asked the AI to create a series of logo tests. Featured here are generic data sets constructed from past images, found by prompting “SODA POP LOGO”. We used the GLID-3 model because it gives a wide variety of images (as opposed to Dall-E or Midjourney, which produce more impressive images, but that's not the point at this stage).
You ask yourself: Are designers even important anymore? It seems like the design and advertising industries have fallen into the same old habits. We mine the internet for references, build the same mood boards and ultimately create the same pastiche work using the same tools. If we’re all copying and pasting the same ideas in exactly the same way, why don’t we just automate the whole process?
If designers are merely required to create the pastiche and repetitive, perhaps we can leave that to automation and lighten the load on graphic designers and creatives. But where does that leave designers in the design process?
“I often feel like designers themselves don’t want to copy other designers,” says Somnath Bhatt, a designer and artist who has used custom scripts and software to help him with his animation and illustration work. “But usually, it’s the client who insists that because this is so-and-so brand, it should look like this other so-and-so brand. I would hope that this bridge between the client and the designer’s vision can become clearer.”
“Maybe a factor why work keeps looking the same is because the algorithm is perhaps feeding us work that gets audience engagement in a specific way?” – Somnath Bhatt
02: Next, we asked the AI to develop the initial packaging tests for the can. We used prompts such as “SODA POP CANS IN CYBERPUNK STYLE”.
Software that uses machine learning to aid design choices, or create full-blown designs, already exists. Netflix has previously used computer vision (a type of artificial intelligence) to determine which frame is most suitable as a thumbnail, scoring each frame based on a character’s expression to narrow down choices for its creative team. In the advertising world, services like Pencil and Bannerbear instantly create complete designs and visuals with only a few inputs.
On a larger scale, Alibaba’s Luban has created over six million banners for 200,000 merchants on its platform using its one-click image generation service, bypassing the merchants’ need for a designer to iterate promotional images for their hundreds of products. It seems we’re on the precipice of being at the mercy of the full-service content creation hub, where all clients need to do is enter a few keywords and they’re set for the next product cycle.
03: We felt our drinks brand then needed a history. GLID-3 generated a fictitious account of our brand, showcasing a “SODA POP” through the ages – from adverts, menus and imaginary customers. We didn't use much of this but it would have given us routes further down the line to build from.
“It’s easy to understand why a Dall-E image of Jack Black playing blackjack is so funny to us, but that’s a gag an AI would find impossible to explain.” – Alif Ibrahim
Other tools that are available today, such as Adobe Sensei with its content-aware capabilities and Uizard that turns sketches into prototypes, are positioned as ways to expedite time-consuming tasks, which looks like the most direct benefit of adopting these tools. Daniel Wenzel, a senior art director at DIA Studio, has found that automation has helped him do some of the heavy lifting in his design process. “Automation has always been interesting to me simply because I enjoy being creative, but I don’t like to execute it. I like to get things done as fast as possible,” Daniel tells It’s Nice That. “I think it is becoming increasingly undeniable that AI will soon surpass us in craft, if it hasn’t already.”
The advancement in the underlying technology has made these methods easier to use, even within the span of only a few years. In 2019, Daniel experimented with machine learning for his thesis, Automated Type Design, where he was able to create over 100 designs by training on 64-pixel images. Today, images that previously took days to train on take only half a day, and he’s able to work with images 1024 pixels in size.
The signs are clear: perhaps it’s high time we start reconsidering the role of the designer today. Certainly, this development is a revelation for underfunded marketing departments and designers who are often forced to bite off more than they can chew. If we already know what is expected from a fizzy drink company’s logo, or a streetwear brand’s launch campaign, perhaps these tools will be more than enough to serve those purposes. But will this lead us down a path towards obsolescence? If we are now able to create logos by entering a few keywords and to instantly resize tactical content, what will our junior graphic designers have left to do? Why would we hire illustrators if a tool can come up with any image you desire? Will a certain type of design go extinct?
05: We then used the term “Fizzy Love” as a prompt for GLID-3, which produced the words “Fizuy” and “Lub”. We bolted these together to give us our name.
If the task of creating the bulk of visual media today can be left to computers, designers might have to become more critical about the work that they create. “When I was growing up, I read that if design isn’t in conversation with the environment that it’s coming out of, then how is it good design?” says Somnath. “Why does an app from Indonesia have to look like it’s been made in Silicon Valley? Perhaps we can be more open to our own surroundings and geopolitical realities, thinking more about localism and cultural nuance.” For Somnath, our cultural values ought to be part of our core process, rather than a surface treatment to show design’s utility beyond an industry.
“Why does an app from Indonesia have to look like it’s been made in Silicon Valley? Perhaps we can be more open to our own surroundings and geopolitical realities.” – Somnath Bhatt
If this automation-led future of design becomes the norm, the impact of this change will be most felt by the workers who are currently tasked with creating repetitive designs, often at the mercy of more senior people at their studios. Fresh graduates, entry-level freelancers or those trying to break into the design industry may find it more difficult to find their footing as their roles are replaced. This issue, called skill-biased technological change, is a concern that has come up again and again when it comes to automation, from driverless cars replacing long-distance truck drivers to robots replacing factory workers.
06: Our next step was to explore a logo design with our name by feeding “Fizuy Lub” back into the AI but using logo examples as the visual prompts.
Daniel believes that good designers shouldn’t feel threatened by automation. “I’m convinced that AI is more than capable of designing the world’s most legible typeface, but it is still not capable of curating something objectively wrong as subjectively good,” says Daniel. “That’s where I see the real art in type design – designing a typeface with character that stands out from the mass of soulless Helvetica clones.”
For all the advances made in AI, we are still pretty far from generalised intelligence. AI and automation are good at executing specific tasks, but not as good at applying what they have learned in different contexts, or in ways that make sense to humans. It will likely take years before automation arrives to replace entry-level designers.
07: Commercial Type were then briefed with the AI’s explorations to create a logo for a fizzy drinks brand (our only direction, other than the AI findings, was to make it chunky). They took the pieces they were drawn to and created this series of tests.
In the meantime, as the industry begins to shift, it’s perhaps the right time to reconsider what good design education looks like. If designers are willing to learn motion graphics and 3D software to keep up with the latest design trends and fulfil the needs of clients, perhaps they can adapt to a new role as curators of AI byproducts, rather than mere executors, during this transition period. Courses, apprenticeships and mentors ought to train new designers in what they should expect in the coming years.
We can thus treat automation and AI as partners in experimentation, offering new perspectives and unexpected results outside of the language that we’re already familiar with. For instance, in a recent work, Somnath wanted to replicate how a seashell grew, an almost pixel-by-pixel process, and employed the help of his friend Ted Wiggin to translate this metaphor into a creative process. “Whenever I draw something, it starts to grow in the same rhythm that a seashell grows over time,” he says. Far from the pinnacle of efficiency that we often associate with automation, the tool instead gives Somnath the opportunity to create fascinating illustrations that evolve autonomously. “It’s not very utilitarian or streamlined but I can’t myself grow a drawing without automation. The byproduct of that is that it creates these really interesting shapes, movements, lines and colours by this automated logic of growth and I can use that as the fodder for whatever final composition that I’m interested in generating.”
“If we are now able to create logos by entering a few keywords and instantly resize tactical content, what will our junior graphic designers have left to do?” – Alif Ibrahim
09: We then needed a can design for Fizuy Lub, so used the Majesty Diffusion model to create a richer, more detailed image. We liked the highly coloured cans from our first experiment (#2), so began there. Then, when it started kicking out colourful tongues and bubbles, we steered the AI towards those versions. This gave us the background image for our can.
“Personally, I am happy when I no longer have to move pixels – when you no longer have to do anything yourself, except for being creative,” says Daniel. Take the Dall-E Mini image generator, which allows us to see images we could previously only imagine, from babies doing parkour to a Teletubbies coat of arms – scratching an itch in our brain we never thought we had. We might be in a sweet spot where these tools are too naive to be fully autonomous, but can serve as prompts to reignite creative thinking that has been dulled by the endless Instagram stories we’ve had to resize. So, if a certain type of design were to go extinct, perhaps that is something we should celebrate.
So how do we stop making the same thing over and over again? It’s important to take a step back and see design not just as an event of visual execution, but as a way to engage with our environment, one that falls outside of the realm of AI and automation. Designing doesn’t start when you open Photoshop and it doesn’t end when you save the final deliverable on your laptop. “When we send files to print, it’s already an automated process, but to be able to print in a very bespoke way, we all have certain halftone processes or Pantones that we mix as our creative bypass to this standard design-to-print thing,” Somnath says. Perhaps more designers will feel a bigger responsibility for the concepts that they come up with and this starts with how we navigate the world in the first place.
“That’s where I see the real art in type design – designing a typeface with character that stands out from the mass of soulless Helvetica clones.” – Daniel Wenzel, DIA Studio
“Automation has always been interesting to me simply because I enjoy being creative, but I don’t like to execute it.” – Daniel Wenzel, DIA Studio
In fact, automation is already an essential part of how many designers work, just not in the way that they expected it to be. Perhaps automation is what got us to this mess in the first place. “The [social media-based] algorithm definitely prefers a very specific thing. If a work looks a certain way, the algorithm boosts it more than if it looks another way. Maybe a factor why work keeps looking the same is because the algorithm is perhaps feeding us work that gets audience engagement in a specific way?” asks Somnath.
“The first step should be to stop referencing contemporary design. If everyone references each other, we will only go in circles and will never be able to break out,” Daniel says. “I think it’s super important to find references outside of your own bubble, whether it’s from architecture, sports, pop culture and so on. That’s also what currently separates us from AI. We are able to create ideas from the most unrelated things, experiences or impressions that only emerge as we as humans interact with society. Life is the best mood board and it’s unique to everyone.” It’s easy to understand why a Dall-E image of Jack Black playing blackjack is so funny to us, but that’s a gag an AI would find impossible to explain.
It’s only human to have a fear of the unknown and this is especially true when it comes to an uncanny Lovecraftian piece of technology that threatens to replace us. More often than not, however, we find that technology ends up sitting within our existing practice as it intensifies our already-existing relationship with one another and the world. So if design feels too formulaic, perhaps it’s time to shake up what we understand it to be, starting with how we situate ourselves within this world and how we navigate it.
About the Author
Alif joined It's Nice That as an editorial assistant from September to December 2019 after completing an MA in Digital Media at Goldsmiths, University of London. His writing often looks at the impact of art and technology on society.