For the past few days I’ve been collaborating with artificial intelligence to design a range of furniture and homewares. It started when I saw a friend’s post on Instagram: “Here’s some digital art experiments I’ve been working on with Midjourney. I could use this paragraph to talk about the vast implications of AI in art but instead I’m just gonna hit you with some silly captions” -Stackhat
The portraits were wild: hyper-realistic surrealism, but beyond that they had an unfamiliar feeling to them.
“What’s Midjourney?” I asked.
“I’ll send you an invite,” he replied.
Midjourney is an AI bot: you feed in text and it spits out images. It is accessed through Discord (a messaging app) and my invitation offered a trial of the closed-beta version. I logged in, read the instructions, navigated to a thread called #newbies-16 and found a bunch of people throwing random words at a chatbot. You start by entering the command “/imagine” followed by a series of descriptive prompts. I had a go:
“/imagine a robot in the rain drawn by a child”
It processed for a minute and then presented four image options. I then had the choice to upscale one or all of them, or make another set of variations (based on the original prompt, or riffing on a selected image).
The results were pleasing: the artwork looked handmade, but it had that same strange quality I noticed in my friend’s portraits.
I offered my two-year-old daughter a turn, entering the first three words I heard her say: “bunny, cuddle, grandma”
Creepy! That’s the last time she’ll be using it.
I showed my partner over breakfast and she said “That’s wild! I wonder if you can get it to design furniture?”, and that’s when things took off…
“Organic timber chair”
“Retro futuristic mustard sofa”
Midjourney Furniture design - Retro Futuristic Mustard Sofa
“Psychedelic space teapot”
Midjourney object design - psychedelic space teapot
Holy moley, I instantly saw the ramifications this could have for design (both the practice and the things created). I’d heard of “generative design”, but knew this as something that happened in CAD software, where you input design constraints and AI generates possible solutions. Midjourney is different: it’s open to all, and capable of creating whole new concepts from nothing but a two-word text command. My mind raced with questions: is this bot going to steal our jobs, or will designers use it as a powerful tool for creativity? If this is intelligent, is it a tool or a collaborator? How will we need to change our practice to collaborate with AI? Do designers become reduced to prompt crafters? Does this make us writers, not designers? What about the knowledge, skill and life experience of designers? Could I make some of this stuff IRL?
It’s too early to answer these questions deeply, but I was instantly filled with a sense of concern. I’m a social designer, which means I value the experience of designing, and particularly co-designing with other humans, in ways that engage with story and making, to foster meaningful relationships between people and the material world. I wanted no part in this AI business, but did feel like I was on the verge of a zeitgeist and wanted to make sense of it for myself.
So I started playing, beginning with some more abstract terms. I had seen Midjourney described as an “artmaking tool” trained in “style transfer” (pasting the aesthetic of an artist or art style onto a specified scene or subject).
Could style transfer be applied to furniture and objects?
“Fluffy armchair in style of Studio Ghibli character”
Midjourney furniture design - Fluffy armchair in the style of studio Ghibli character
“Dining chair in style of Egon Schiele”
Midjourney furniture design - Dining Chair in the style of Egon Schiele
Wow. At first I entered famous artists and film-makers in my prompts. In doing this I noticed a few assumptions: artists would have more distinct styles the bot would recognise; the bot probably wouldn’t know as many designers; and riffing on other designers’ work is considered copying. I’d learnt in design school this wasn’t the done thing…do the same rules apply here?
I tried a couple, just to see what happened:
“Chaise lounge in the style of Marc Newson”
Midjourney furniture design - Chaise lounge in the style of Marc Newson
I wondered if the bot would reinterpret the famous Lockheed Lounge. Interestingly, it mashed the style of other pieces (maybe his car and/or suitcase?) onto another lounge, pulling its surface, colour and form in interesting directions.
Testing my assumptions further, I explored the bot’s built-in bias by entering the name of an Australian First Nations female designer, to see if it knew her work.
“Table and chairs in the style of Nicole Monks”
Midjourney furniture design - Table and chairs in the style of Nicole Monks
That seemed to confuse it. But “table setting in the style of Nicole Monks” returned a piece that at least used textures that could be found in her work.
Midjourney object design - Table setting in the style of Nicole Monks
This surprised me and made me wonder what source data the bot was using - does it have its own library or does it just search the net? If it did “know” or had “found” Nicole what are the ethics of artificial intelligence reinterpreting the work of a First Nations designer, and the 60,000 years of cultural knowledge that sits behind her work…especially if that bot was being driven by a white fella?
I showed this work to Nicole and asked what she thought. She said, “you are on point with these questions, and in fact I was just part of round table about Indigenous Cultural and Intellectual Property in the digital space, we are currently building a framework around these questions”
Turning to ceramics, I entered “Japanese block print of spotty vase”, wondering what would happen when an art-making practice was entered rather than an artist’s name. I upscaled an image and made variations on a couple until landing on one I liked.
Midjourney object design - Japanese block print of spotty vase
I started feeling uncomfortable; the experiment was continuing into a murky place of cultural appropriation. Even if AI was the one doing it, I was still prompting the design.
Midjourney object design - crumpled brass bowl with blue and white spots inside
This may be obvious to people experienced in coding, but I realised I was learning how to speak with a robot.
This is when I started thinking deeply about the agency of AI and my relationship with it in the design process. At first I (as the human) felt passive: I fed a few words in, pushed a button and (like a pokie) watched the machine roll through a series of image variations until it arrived at its result. But then I found myself saying to my partner “I just designed some pretty crazy furniture in Midjourney”, to which she replied “I’m not sure you can say YOU designed them”.
Then, when starting an Instagram account to share the creations, I struggled to write the bio - how to summarise what was happening here? “Furniture and objects designed by AI, guided by a human designer”. I then replaced “guided” with “prompted” to adopt the language of this new practice, but finally settled on “Furniture and objects co-designed by AI and a human designer”, feeling like there was some sense of collaboration happening: agency was shared between us.
And within the process of developing each piece (in what you could call a micro-co-design process) there is a definite exchange between the human designer and the bot:
Human: enters text prompt specifying object category, form, texture, colour, style etc.
Bot: interprets prompt and responds with image choices
Human: chooses image to upscale (i.e. saying to the bot “I reckon it’s a bit like this one”)
Bot: details the chosen image
Human: may then choose to do more variations on that image (saying to the bot “that one’s nice, but can we do a few more concept sketches in a similar style”).
This may go on for a while or the human designer may choose to rewrite the prompts to take the design in a different direction.
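For readers who think in code, that exchange can be sketched as a simple loop. This is purely illustrative: `imagine`, `upscale` and `vary` are made-up stand-ins for the bot’s behaviour, not Midjourney’s real commands or API, and the “images” here are just labelled strings.

```python
# A minimal, hypothetical sketch of the prompt -> choose -> upscale/vary loop.
# None of these functions exist in Midjourney; they just model the exchange.
import random

def imagine(prompt, seed=None):
    """Stand-in for '/imagine': return four candidate 'images' for a prompt."""
    rng = random.Random(seed)
    return [f"{prompt} [variant {i}, seed {rng.randint(0, 9999)}]" for i in range(1, 5)]

def upscale(image):
    """Stand-in for the upscale step: the bot 'details' the chosen image."""
    return image + " (upscaled)"

def vary(image):
    """Stand-in for the variation step: four new riffs on a chosen image."""
    return [f"{image} ~ riff {i}" for i in range(1, 5)]

# One round of the micro-co-design exchange:
options = imagine("retro futuristic mustard sofa", seed=42)  # human prompts
choice = options[1]                                          # human picks one
final = upscale(choice)                                      # bot details it
riffs = vary(choice)                                         # ...or riffs further
```

The loop continues as long as the human keeps choosing images to vary; rewriting the prompt is the equivalent of starting a fresh call to `imagine`.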
After realising I was still designing in this process (to some extent), I explored my agency as a human designer collaborating with AI. This involved moving beyond style transfer to taking time to richly “/imagine” objects in MY mind first, and then writing more intentional design prompts that described those objects.
“Large pendant lampshade made from layered white and dark green folded paper”
Midjourney object design - large pendant lampshade made from layered white and dark green folded paper
Another digital creator commented on this picture on Instagram
“Very cool. What you’re doing here is really hard to do right now. I bet in a few years it’ll be easier to get cleaner structures”
This made me realise I was contorting the bot to work with materials and construction methods that I was familiar with. But this is a whole new design practice, so why stick with materials of this world?
“Large pendant lampshade made from white and dark green stardust”
Midjourney object design - large pendant lampshade made from white and dark green star dust
Or why not see how the bot materialises non-material things?
“Large pendant lampshade made from white and dark green fluffy ambitions”
Or, if I’m collaborating with a digital being, why not allow it to choose material from its world rather than mine?
“Large pendant lampshade made from white and dark green computer software”
Midjourney object design - large pendant lampshade made from white and dark green computer software
I was curious to learn what computer software looked like, and the image that emerged struck me: it looked like a picture of the other side, a view from the software’s perspective, sitting in a matrix world peering back at me through the portal of the phone screen. We had seen each other.
On the day I finished this reflection, Midjourney moved to open beta, and already a bunch of Midjourney art, architecture, interior, fashion and animation pages are emerging…we’re in for a ride. Hold onto your agency, folks!