The Timeless Principles of UX Design — Interview with Nosipho Nwigbo

What do ancient African circular huts and modern digital experience have in common?

Today, we're exploring how good UX design isn't something invented in the digital age but has existed for centuries, and why these timeless principles still shape the most effective websites and apps you use today.

Hi, my name is Nils, co-founder of Dinghy Studio.

As both a seasoned web developer and UI designer, I bridge the critical gap between design and development while leading our fully remote digital strategy team,

helping organizations create sustainable solutions that transform user experience into business success, especially as AI reshapes how we interact with technology.

And today I'm thrilled to welcome back to the show my colleague Nosipho Nwigbo.

She is a UX research specialist at Dinghy Studio with a natural gift for asking the right questions.

With a social science background from the University of Cape Town, she combines her training in sociology and history with three years of specialized UX research experience.

Hailing from Johannesburg, South Africa, Nosipho brings a unique perspective to creating
inclusive, human-centered experiences.

She's dedicated to understanding the core issues within products and bridging the gap
between users and technology.

And with that, let's get right into it.

Hey, Nosy, welcome back to the show.

Glad to have you.

So we're kind of getting into UX research this month.

I think it's our focus topic for the newsletter as well.

And I wanted to interview you on all of the topics that we're laying out there because I
really loved the analogy that you picked up for the intro for everyone listening in today.

We're talking about UX research.

You know that we love the topic, and we have

three segments that we want to get through in the course of the podcast today. The first is UX research demystified.

I love that you're on for that topic, Nosy, because it's a methodology that we use daily anyway.

And I think as an agency, we kind of stumble upon these things so often.

So I'm really curious to...

to hear your takes on the myths that we're trying to bust here.

Right?

Second segment, research methods overview.

Generally, I think we'll try to make this more approachable because at least I feel that
when I speak with our customers, that UX research is still this thing that seems like,

seems to be like this black box of magic where people, I don't know, you talk about it and
they're all like,

Yes, yes, it sounds very useful, but we don't understand what's happening.

Kinda.

And I think we feel it doesn't have to be that way, because it's actually, well, not black magic, but super straightforward stuff that even happens in everyday life sometimes.

Let's see, let's see how we get into it.

And then just to address a hot discussion at the moment or in the last couple of weeks.

And that is of course the influence of our overarching digital friend, artificial
intelligence and how it comes together with the whole realm of UX research, right?

Exactly.

Awesome.

So you started this month's story about all of this with the case study of, I'm probably going to pronounce it wrong, but the case study of the rondavel.

Can you take us through it and yeah, maybe share what inspired you to get into it?


So for the listeners who don't know, I am currently based in Johannesburg, which is in
South Africa.

And when I was looking at this newsletter, the first thing that I wanted to, or the first
thing that came to mind was how does UX manifest itself in my daily life?

What around me

has principles of UX, and how does this intersect with our daily lived experiences and the products and tools of our daily lives?

And the first thing that came to mind was that

user experience is formulated on human experience.

And with that thought, I thought about what informs my own human experience, and immediately the rondavel came to mind.

A rondavel, for those who don't know, is a house that is often found in rural villages across South Africa, Southern Africa,

particularly in KwaZulu-Natal, which is where my maternal family is from.

So in our village, we have a lot of these structures, which are our homes.

In one plot, we'd have...

maybe one or two.

So one is where we have our vegetables, our gatherings, our kitchen; any sort of celebration is held in that rondavel.

And then the other is where we probably live, we stay, we sleep.

So there are different rondavels for different things.

There are different rondavels for different families.

So when you have a family unit, your extended family will live on the same plot, but everyone will have their own rondavel.


Yeah.

So just to give a visual clue to what it looks like, it's a circular building and then it
has like a pointed, often thatched roof.

But as we have access to more technological advances, these buildings are built with bricks and roof tiles, et cetera.

But in the past, they used to be built with stone and mud, just using the natural resources we had around us.

Awesome.

So yeah, it's round.

It's a rondavel.

Yeah.

And does it have rooms inside of it?

No.

No, it's just one room.

No walls, just one room.

And with that in mind, it being circular with no additional rooms, I started to think about how the structure is built for human existence.

How is the structure built for the community?

How is the structure built with user experience in mind?

And one of the things that I discuss in the newsletter is that it caters for Zulu culture.

So in our culture, when you enter a room, you have to acknowledge everyone that's in this
room.

And it's considered really rude for you not to acknowledge people when you enter a room.

And with it being circular, you're able to see everyone who's in the space.

Whereas if it was,

for example, a rectangular or square home where there are different divisions and different rooms, you won't really be able to acknowledge everyone in the space.

So that's how the structure is built for user experience: the user need is to be able to acknowledge everyone in the space, and the product that comes out of it, a circular

room, allows you to see and acknowledge everyone in the space.

Another example that I speak of in the newsletter is that it's quite rude to turn your back to someone.
to face your back to someone.

And when you're in a round space, you're less likely to show your back to anyone in the
room.

So everyone sort of sits on the periphery of the shape.

So you're able to see everyone else in the room while your back is to the wall.

So these are cultures and norms and needs that the product allows for.

And basically with this piece, I just wanted to highlight how user experience not only
tackles digital products, but it comes from our daily lived experience.

And user experience is a means of articulating what the human experience is.

And I think this is just a very personal and very fun segment to open this newsletter
with.

I really loved it.

Like when I read through the first draft, I was like, hmm, what's a rondavel, like, what are we getting into?

And then I read through it and got the whole thing.

Like, of course you can turn your back to someone in a rondavel, but just because it's circular, right?

Like it, it encourages this more, this group dynamic of sitting around in a circle or like
standing.

facing each other. Like also, I don't know, so many games that we played at our son's birthday party.

Like everything is always in a circle.

So everyone in a group of 15 or whatever can look at each other.

Yeah.

And I, I really loved that.

And it reminded me that, and that's also something that we touched upon in, in the course
of this podcast more than once, I think.

is that this term of user experience is always a little bit, I don't know if it's clunky or whatever. Like, I think the way that it came to be, or how it's understood,

well, I'm not even sure, and we have a UX design agency, right?

But so what I understand from where it comes from is that it's supposed to describe the
experience with the digital part of a product.

Right?

So if we talk about the entire customer experience of like a customer of any consumer
product, that's just imagine, I don't know, whatever, like a drink or something.

And then the customer buys this drink and they have an experience.

then the customer experience is kind of the overarching thing.

And whenever it's about something digital, then we start to talk about the user experience
and that's fine and all.

I just kind of think the take that we always have at Dinghy is user experience is not UI
design.

And it's kind of not only the experience you have with the digital product.

And it actually doesn't really matter if the website is technically slow or the copy text is bad.

Like the user, like your customer, they don't care.

Like for them, it's just a bad experience.

Like mashed together.

That's, that's all that it is.

And I think this is why, coming from that stance, I really loved this example: taking something that, first of all, was developed over centuries and is not digital at all.

And it's culturally specific, and it's just a solution to a problem, the way you explained it.

And so I loved that, especially as an intro to, um, to our first segment that we're going
to talk about today.

And, um, that is.

Common myths around user experience research.

I think there's so much to discuss here.

Let's just get right into it.

Like, as I said in the intro, we have this so often.

Like when I talk to customers, they're usually amazed when I just tell them what we're going to do.

They're like, oh, this is not as complicated as I thought.

I'm like, no.

In most cases, we literally ask for people's opinions.

So the first one you had was UX research is only for experts with big budgets.

Woof.

Yeah, so the assumption is that you need a really big budget because it's always a very
long drawn out exercise and it's usually conducted by super formal scientists who are very

accurate and very scientific and very, very grand just with their research.

But here at Dinghy, we are very scientific and very precise and very rigorous,

but we do do it on the client's budget, and we pride ourselves on it being rapid.

The research that we do utilizes simple methods that produce meaningful findings.

Well, I can certainly relate to that.

I sort of learned how to work alongside UXR methods from you.

You turned this company around to be research focused anyway.

And so I had my fair share of learning here.

And I think with this, I was probably even one to think that myth number one would be
true.

I kind of thought...

user research.

Yeah, that's probably super useful.

But when are we going to do it?

Like, who's going to pay for this?

And my big learning from this is that this is actually like the way I feel about it now
and that we have proven over and over in the projects that we did since you joined is that

it's actually not something that comes on additionally in the definition or like looking
at the budget or like the time that you need for a project, I would say

It's an additional activity, sure, but it actually results into projects being quicker and
being less expensive.

And I think that's the counterintuitive part here because it's another thing you do, but
it's, yeah, it saves you over time.

And that is just something where I think we did that with small clients.

We did that with big clients, like with websites, with apps, with whatever really.

And so.

When I conducted my first interviews, I just thought it's amazing that I never go out of
an interview without having learned something that I didn't know before.

Like, I was, especially in the beginning, I was so sure, you know, I'm a seasoned UX
designer, but of course the button goes there and has this label.

It was just never the case.

You know?

Exactly.

And um just to go further into that point, I think the assumption is that when we are
doing this research, when it comes to big budgets, we're thinking of getting participants,

that it's really difficult to get participants.

But even if you're a small team, speaking to one customer is better than speaking to no
one at all.

And as long as you can speak to your target audience, as long as you can

close in on that relationship between yourself and your customer,

that's a great starting point.

So try and speak to your customer, just to test assumptions and questions that you have about your product, and that in and of itself is UX research.

So.

Nice.

Awesome.

So myth number two, UX research is just asking users what they want.

So I definitely can see that one.

I think I can't count the times that I had to debunk that myth in a project planning call, when people said, well, then they're just going to tell you their opinion and then what, we just

do it, or what?

And I'm like, well, no, of course not.

So yeah, what's your take on that one?

Yeah.

So the idea is we use both attitudinal and behavioral methods to really dig deeper into
what is it that is needed by the customer?

What is it that the customer is battling with?

What comes easy to the customer?

What does the customer say that they want but then behave differently to?

There are so many times, Nils, that we've been in sessions where the customer said, I don't like this,

but they're able to use it very easily.

Or they say, this is really nice, but you see them use it in a completely different way to what it's supposed to look like.

So I think this is where choosing the correct method comes in.

Using methods that assess both the attitudes and the behaviors of your users, not taking things at face value, and really being rigorous when it comes to the methods that you

use and the way in which you observe and interact with your customers when they are in these test environments.

Interesting.

So like, I mean, one thing that I have experience with is conducting a usability interview, for example, like a usability test, where we would have like a design or anything,

like something they can have on their screen.

And we would go on a Teams call or like a Google meet or whatever, anything where they can
share their screen.

So they open it, share their screen and we watch.

Right.

And in this situation, I can absolutely see what you mean.

Like they, they talk, they kind of brain dump onto you, whatever they think they're kind
of looking at or what they're thinking and so on.

And while they're talking, you still see them interacting with the thing.

Right.

And yeah, I absolutely recall many situations where they were like saying this and that
and were kind of happily going through the whole thing.


We're going to talk about research methods in the second segment a little later.

Outside of usability testing, do you have another example of where there's, like, almost a little split between what they say and what they do?

Yeah.

Even in, for example, user interviews, right?

When the user interview begins and the user might be a bit nervous and a bit scared to
tell what's on their mind or say what's on their mind, they might say something that they

think you want to hear.

But as that rapport is being built with the user and they're starting to become more
comfortable and you circle back to these questions and you're asking it in a different

way.

That's when you really get to the heart of it and the core problem that they might have.

So, um, yeah, that's definitely an example of really understanding what your client is saying and being empathetic to your users.

Yeah.

Just really being obsessed with the client and the participants and trying to understand
and decipher them and what they might mean and what are they saying and circling back and

asking follow up questions.

Interesting.

Do you have something like a rule, that you don't take the first five minutes for real anyway?

Or is it, you know, like, do you have some kind of preset?

Yeah, take it with a grain of salt.

Yeah.

So what I like to do is that I usually start off with, like, off-the-record warm-up questions, just to get to know the participant that I'm talking to.

Talk about if they have a dog, if I see a cat on their screen, just to warm them up and
make them feel comfortable.

Also affirming that look, there's nothing to be scared of.

There's no right or wrong answer.

You telling the truth is really, really helping me build a better product.

So yeah, yeah, I had a couple of

situations where people were worried, even though, like, you primed the entire thing, right?

Like you're saying, would it be possible to get on like a 20 minute call and I'm really
interested in your honest opinion, like you say all of the things, right?

And they still, after like the first couple of minutes, like you get the feeling they're holding back, right?

And you kind of try to get behind that or try to motivate them to open up.

So then at some point they kind of...

Like, as if they were trying to get behind the microphone or whatever they're trying to say:

Well, but I don't want to talk badly about your product.


Yeah, yeah, I've had so many participants who are like, sorry, I'm so sorry, but this sucks.

And I'm like, tell me, tell me it sucks.

Tell me I want to hear it.

Yeah, interesting.

Awesome.

Let's get back to the myths. Like, this is the point I want to kind of put a pin in, because I think it's very interesting culturally:

how okay is someone from a specific background with being honest about what they think, even when they're asked directly for exactly this?

Exactly.

Exactly.

Next one: UX research is time consuming and delays projects.

Well, yeah, that's a popular one, I would say.

Yeah, it is a popular one.

And I think we've managed to combat that internally, which I'm really excited about.

So one story that I tell time and time again is that when I was quite early in my career, I had just joined Dinghy and I was still trying to find my footing, trying to figure out: how

does UX research work?

How can I report back?

How does the team take in

information, what do they prefer?

And I was so obsessed with delivering these really long-winded reports that are super academic, very rigorous, and go into every single aspect of the research.

And that in and of itself started to delay projects.

And it was when we developed a rapid methodology, or rapid approach, to user experience research that we really began to see the fruits of conducting this research.

As a service, I think we've really perfected the art of producing research studies that are not time consuming

and that don't delay the project, by ensuring that, one, we have a project roadmap that includes research as soon as possible.

So the researcher knows when are things due, when am I starting to look for participants,
et cetera.

So by including researchers in the project roadmap, that's how you're able to combat these
delays.

Two, as a UX researcher,

understanding how your team takes in information, what's important to your team, what things can just live in the repository, and what things need to be

surfaced to your team.

So I think this is something that's case by case, but I think it's something that we really pride ourselves on, being really rapid when it comes to our user experience

research and bringing in these findings as soon as possible.

So I remember, of course. First of all, I think I was very impressed by, like, the first time you finished a report like that.

I was like...

Wow.

Like, the level of detail about something. I think I didn't even think it would be possible to describe something in such a comprehensive way.

Like, especially because I mean, I don't know, we started an agency after working in
startups and bigger companies and whatever.

And so I think an agency is a special case in and of itself, because, like, we always have projects running in parallel.

Like, stuff depends on the customer's timeline.

So kind of naturally, I always feel like running an agency, as much as you try to keep it calm,

it sort of kind of turns into a hot potato tossing contest anyway.

I, yeah.

And at the time I was really worried that we would freak you out.

with this, like with this entire situation.

And I was really wondering how, how we could fit research into this.

Um, but so my question is, I think to stack on top of myth number three: would you say this myth might even exist rightfully so? Because, yeah.

Definitely.

I think for research to be rigorous, it takes time.

And also based on the different methodologies that are taken or chosen, that too takes
time.

So it is rooted in something, but I do think there are ways to cut corners.

You know, in order to deliver findings at the right time, because it's better for you to
bring findings that are fresh and needed for your team to move forward than bring it when

it's too late or it's not useful to your team anymore.

So you mentioned that it was very helpful for you to see that research is already included
on a project roadmap so you know when it comes and when to prepare.

We talked about this fact so many times outside of research at all.

Like I spoke about this with stakeholders, product managers,

legal people, copywriters, video editors. Like everyone, everyone had the same stance. They said, well, I love this wireframing phase so much, or this concept phase of the project

that you're running, because I know what to prepare for.

Like usually, it's like four weeks to the launch, people call us up and say, now please shoot like three videos and edit them in, well, basically, actually only two weeks,

because we have to take them online and also transcribe them and so on.

And they're like, how on earth am I supposed to do quality work here?

And so do you think that is something that is usually overlooked as well?

Like, is it that you just don't know when to come in and when to provide value with the research that you're supposed to do?

So even if we take it to most companies: most companies work in this agile, lean sort of work mode, right?

And every morning you'd have your stand-up, you'd have your scrum, and only the designers are included in those meetings.

They are checked up on, they are updated, they are allowed to bring in any stumbling
blocks that they might have, anything that might delay the project.

And user experience research is sort of not included in that.

And that's an aspect that also is dependent on the project.

That's something that can also delay the project.

And just taking it down to the grassroots of including UX research in every single thing

that's done, you know, it has to be included in the process of delivering a product.

And when this aspect, this department, is included, that is when they're able to work alongside the entire team and ensure that work is timed in a way that does not delay

the project as a whole.

It's absolutely like that, absolutely.

Like whenever we took someone along. Or, like, even if it now sounds maybe a little intimidating that everyone has to be on the daily call every day.

Like this is not necessarily what it's going to be, but I think the essential part is that
whoever is supposed to deliver something to a project, they kind of got to know.

Like, I mean, who doesn't like to know upfront that something's coming?

You know?

Exactly.

So like, for me, this is something that is just super natural. If you're designing a website and it should have, like, photos, videos, and copy text on it, and it

should look nice and it should, well, be on the internet as a website and not as a static image.

And you probably need a copywriter, a photographer, a videographer,

like a front-end developer, and someone who sets up a web server.

And all of them should kind of know that you're making a website.

Otherwise this will end in chaos.

Exactly.

Right.

Yeah, for sure.

Because my biggest pushback that I hear is, well, what you're proposing sounds like chaos.

And my biggest comeback to that is honestly not doing it to me sounds like chaos because
with the waterfall method you're doing, everybody just, you know, will come in way too

late.

The ones that come in at the end of the project are usually the ones that are then maximally stressed, because everything is delayed and took too long, and

they still want to

you know, be on time with the deadline.

So whoever comes in last gets the shortest time to do their thing.

All right, cool, cool. Next one.

Super interesting with regards to the intro.

UX research is only for digital products.

Yeah, I think we got into that in the intro, right?

That was an excellent example of how UX research or UX in general is not only for digital
products, but seeps into our everyday life.

Another really cool example was that I was reading this case study of when self checkouts,
like at a grocery store.

Yeah, when they were brought onto the market.

Not everyone knew how to operate it.

They did not know how to use it.

And that's how it was assessed in the actual shop, to see how this digital product exists in the real world.

Right.

So actually interacting with people who are coming into the store.

How do they navigate the store?

What makes them decide to self-check out themselves or go to a cashier?

Is there a time basis?

Where are we going to put it in the store?

How are we going to set it up, etc.?

Those are ways in which it's not only a digital product, but you're looking at the entire
cohesiveness of the shop that is utilizing this digital product.

That's a great example, really, because it got me thinking that, as we said in the beginning, we at least still think in these boxes of customer experience, user experience,

brand strategy, like all of this sort of stuff that, well, defines the overall experience
or relationship of a customer to a company.

I mean, we just see every day, I would say, that digital products, they don't...

they don't necessarily exist only in the digital realm anymore, right?

Like I feel the lines are blurring so much between physical and digital stuff that it's,
that I almost feel we should get rid of this whole UX term altogether.

Let's just call it customer experience, or if somebody's not buying from you, then it's
well, just the experience.

And that's it.

Because it makes it sound so fancy, but it's not.

Everybody, if you think for just a second about yourself: the last time you used something, did you like it or not?

Did your coffee machine this morning, did you kind of have a good experience with it?

Or did it ask you for water and to clean the tray and to descale and to, you know, that's
all of the stuff that's on my coffee machine.

And I'm having like a meh meh relationship with it today.

So my user experience with my coffee machine is like, mwah, today.

And so I think I would just love if, and this is why I love this section so much, if we
could just, you know, all chill out a little bit and just make it less complicated.

Like, it's not crazy.

Like we're literally, we're making something.

from humans for humans, we kind of want them to be successful using whatever we're making,
either to get filthy rich or to literally solve a problem or ideally both.

But like, we just want to know if it works.

That's it.

Like, we don't care if it's physical or if it's digital or whatever.

We kind of just want to make sure, can they use it or not?

Can they do the job?

I think this is really why I wanted to turn this newsletter into a podcast, because I love this whole thing so much.

Moving on to the last one.

UX research is a one-time activity.

oh That's an interesting one.

It is an interesting one.

So doing any research at all is great, right?

But to really get the best out of anything is to do it continuously.

So for you to run your research the same way that you do sprints.

So after every feature, before every feature, once after post launch, is it after post
launch or is it just post launch?

Because after

post-launch is post-launch, right?

Yeah, just post lunch.

Okay.

yeah, continuous research enhances the product's relevancy over time and ensures that your
customer's story relates to what you're putting out.

Any new feature that you develop is able to be tested.

Any new feature that you're thinking about can be evaluated with your customer.

These are all the benefits of continuously doing research and embedding it into the
practice of

whatever product is being built or created.

I think this one really turned around for me when I stopped thinking of UX research as an additional service that we could offer optionally, and just integrated it,

the whole thing, into the design process.

I don't even... I think when I put together an offer, there's no way to not do it.

Like it's kinda, and I don't even make a big fuss about it either.

Whenever we design something, we just do research.

Yeah, it just is what it is, right?

It absolutely is.

You know, most of the time, I know I sound like a broken record on this one, but it's just so true as a designer and a front-end developer, like as someone who literally has to put

stuff together in the end, like, to make the thing.

I suffered so many times, especially in the design process from when you just have a blank
canvas and you're staring at it and you kind of know

the project description, of course, like you have the plan, but you got to start
somewhere.

And to have been in user interviews before that, like, or a usability test or a discovery
research session or anything where you would kind of have the opportunity to kind of get

into the topic already a little bit more, to immerse yourself in the whole structure of what you're trying to express

in buttons and text and images and a layout or whatever, is just so helpful, because, like, the people told you before what they will be trying to do with whatever you're making.

So it's so much easier to prioritize that, to just put it on the screen and be like,

is this what you were thinking about?

Kinda, I guess.

It's, you know, like, just having this first draft to iterate on is super helpful.

Let's move over to segment number two, to research methods.

I think we touched upon a couple of them while discussing the myths already.

Do you want to take us through the ones that you had listed here, and maybe just add a little bit of, like, I think, to formulate a goal:

what is even out there?

First of all, I think that will help to demystify the whole thing on a top level.

And then maybe we can even

get people doing it.

I think that would be... I would love, dear listeners, for you to go out and just pick one and do it.

That would be awesome.

So just to give some context, when we as UX researchers are conducting any sort of research, the first thing that we do is pick a methodology based on what we're

trying to achieve.

So there are two main sections, which is quantitative versus qualitative.

So quantitative deals with numbers.

So that's when you want to get hard facts.

You want to get numbers to back up any decision that you're making.

Qualitative

deals with the why and the reasoning behind anything that you're trying to find out.

So when you want to know why is it that your customer is doing ABC, that is when you dive
into qualitative.

Some of these methods use both.

Some of them only use one.

Some of them just use the other.

So that's what I'll be going into with these methods.

I'll be describing what each is best used for and what sort of method it is,

as well as an example of its application.

So starting off with surveys.

So surveys are a quantitative method.

This is when you want to get the numbers behind an action.

You want to get the numbers that will back up any decision that you're making.

And it's best when you have a large

group that you quickly want to get information out of.

So an example of this application would be a checkout process.

So when you want to get feedback on the usage of the feature and just get numbers and
statistics behind this process.

we'd use a survey.

Is this the typical "how did you find this experience, from 0 to 10" kind of thing, or how?

Exactly.

you'd have, like you mentioned, the zero to 10 ratings, you'd have yes or no, you'd have
open-ended, you'd have closed-ended.

But for you to be able to get as much feedback as quickly as possible, you'd have closed-ended questions with options that you provide for your users, so you're able to tally it in Excel or whatever other data processing platform you use.
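As a side note for anyone who wants to try this without a spreadsheet: tallying closed-ended answers is nearly a one-liner in most languages. Here's a minimal Python sketch; the ratings and the "how was checkout, 0 to 10?" question are made up for illustration:

```python
from collections import Counter

# Made-up answers to a closed-ended "How was checkout, 0-10?" question
responses = [8, 9, 3, 10, 7, 9, 2, 8, 10, 6]

tally = Counter(responses)                 # how many people gave each rating
average = sum(responses) / len(responses)  # overall score

print(tally)    # rating -> count
print(average)  # 7.2 for this made-up sample
```

The same tally is what you'd read off a pivot table in Excel; the point is just that closed-ended questions aggregate trivially, which is why they scale to large groups.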

Gotcha.

And so this is usually after the fact? Like, they did something and then you come with a survey?

It can be both. It can be before, trying to understand users' interests, trying to understand a situation; and it can be after, after they've used your product and you want to rate it or see how it works.

So a survey could just be your general Typeform or something, right?

Or Google Forms.

Like, you just come up with a bunch of questions, send it out to a group, and then see, just on a number basis, how many people did this and that, answered this question this and that way.

Exactly.

Next we have interviews.

So interviews are a qualitative method.

This is best for understanding the why behind a behavior, understanding why your customer
does a specific thing.

So the value in this is that you're able to uncover pain points and get the emotional
responses and nuances behind any insights that your customers might have.

So interview, I mean, that kind of implies that it's live, right?

Like what we're doing.

So it's a dialogue.

Is there also like a way to do this asynchronously or something?

Yes, there is.

So I've seen a lot of apps that are coming up where they would give you a question and you
read it out and you give an answer.

And I think that has its place.

Right.

So it's fast.

You're able to get an answer.

But what I really enjoy about moderated interviews is that you're able to probe, have follow-up questions, pick up on specific things, or even have the choice of skipping a question entirely when you see that it's not relevant.

So there is a space for both moderated and unmoderated interviews.

Unmoderated interviews, nice.

All right.

Yeah, yeah.

I was just thinking that it's almost a crossover between surveys and interviews, like a video interview or whatever. Because what I really like about video, or even audio, is that since it's human to human, you can kind of try to understand what's between the lines a little bit better than if it's a written answer, right?

Yeah, cool.

Moving on to usability testing.

So this is my favorite method.

I really enjoy it.

You're able to get both qualitative and quantitative data, in that you get the why and the reasoning behind a decision, but you also get the quantitative side: five customers struggled with the specific feature that we're trying to work on, two customers thought they needed to click on one CTA instead of the other. So you're able to get the numbers behind any sort of reasoning that you have.

And with the usability test, this is when we have a feature that we're trying to test, and we have our users click through it and interact with the design, giving us their most honest opinion whilst we're able to observe: what are they doing, what are they struggling with, and what are they able to do correctly?

Yeah.

I think the biggest surprise in the last couple of years for me, with regard to usability testing, is that I thought this only works on the finished product.

You know? But it just doesn't.

You can. Like, we've usability tested the crudest blueprint napkin drawings, right? As long as you kind of...

You describe the premise, like: this is a website.

You came here from an ad and like, we just want you to see if this sort of makes sense to
you.

And it was just a rough layout and, well, text, and that already started to work.

And not only a little bit. Like, I was really surprised by the level of engagement people had with the wireframe. It was just a black and white drawing and they were still like,

"Oh, you're saying this or that? Interesting."

Yeah.

Okay.

Well, that doesn't make any sense.

So I get it.

"Well, I'm going to scroll a little bit." And I'm like: amazing.

Yeah, it's really interesting how participants are able to interact with anything. It can be a drawing on a piece of paper, and you can take that into usability testing, and you will walk out with information that really helps to fuel the designs that you're creating.

It gets you thinking sometimes, you know.

It does.

Like, also in a way that if you have a really good napkin drawing and they're like, "this is super awesome, let's..." You know, but it's just a drawing. Should I just stop working?

Okay, cool.

All right.

So moving on to card sorting.

So card sorting is a quantitative method.

This is where you get the numbers behind categorization.

So when I say categorization: card sorting is used when you have a design, or you have information architecture, and you're trying to categorize it in a way that matches your user's mental model.

So an example that I've used in the past: we were creating a mega menu, and we didn't know how users categorize certain elements and where they saw them fitting into this mega menu.

So by conducting this card sorting exercise, users were able to sort categories into these
little menu options.

And that helped ensure that whatever we're showing in our information architecture matches the user's mental model.

Amazing.

Yeah.

I was just trying to think of an example, but that's a super good one.

I can totally see that just being super effective.

And what would you say, because in the beginning you said quantitative methods need more participants: how many participants would you try to get for the mega menu example?

That's a really good question.

So I remember, from that study, I think we looked at between 20 and 50 participants to be able to create this design with confidence and go ahead with it.

Before that, it might just be a little random, or you won't be able to see clear clusters or trends.
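For the curious: "clear clusters" in an open card sort usually means pairs of cards that many participants group together. Here's a hedged Python sketch of that aggregation; the card names and the sorts are invented for illustration, not taken from the actual mega menu study:

```python
from itertools import combinations
from collections import Counter

# Each participant's sort: groups of cards (all names hypothetical)
sorts = [
    [{"Pricing", "Plans"}, {"Blog", "Guides"}],
    [{"Pricing", "Plans", "Guides"}, {"Blog"}],
    [{"Pricing", "Plans"}, {"Blog", "Guides"}],
]

# Count how often each pair of cards lands in the same group
pair_counts = Counter()
for groups in sorts:
    for group in groups:
        for a, b in combinations(sorted(group), 2):
            pair_counts[(a, b)] += 1

# Pairs grouped together by most participants suggest a shared menu category
for pair, n in pair_counts.most_common():
    print(pair, f"{n}/{len(sorts)} participants")
```

With 20 to 50 real participants, the strongest pairs stand out clearly from the noise, which is exactly the "clear clusters" signal being described.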

I'm looking at the next one already and I'm really excited to talk about that one.

Yeah.

So this is ethnographic research.

This is a qualitative method.

So ethnographic research is when you immerse yourself in the user's natural environments.

So an example of that could be following, job shadowing a participant for the day.

So an example that we've had with one of our clients, who is a POS (point-of-sale) provider: one of the ethnographic research studies was that the product owner at the time job-shadowed the sales reps to see how they sell their POS devices.

So running into stores, listening to their script, seeing how they find merchants to sell to. What do they say?

What do they take out of their bag?

How do they write down notes?

How do they pitch?

How do they follow up?

Just the everyday life of the user.

It uncovers insights that you can't find in controlled testing.

When you're in the natural environment, that's when you're able to get all the nuances
that you miss when things are controlled and neat and tidy.

You're able to see the messiness of running from one business to the other.

Would you say it's helpful to the method if at some point you could probe? Could you kind of go in and say, "Excuse me, could I maybe ask a little question?" Or is this disturbing the whole premise?

Yeah.

My understanding of this method is to stay as invisible as possible and not to interact
with what is happening and to stay in the background and just fully observe what you're

seeing.

So to me, I would not get involved.

Rather, I would ask questions afterwards.

But when you're observing the actual interaction, just take as many notes as possible and probe afterwards.

Alright, last one.

Right?

Yeah, yeah, last one: heat maps.

This is how you're able to... wait, first: heat maps are a quantitative method.

You're able to visualize and see where your users click and scroll. What are their interests? What are they clicking on the most?

So usually how this looks is that you'll have a screen, and you'll have different areas that highlight green, orange, and red. Those indicate what went well, where there's churn, where there are drop-offs, and where there are any problems on the website.
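Under the hood, a click heat map is little more than click coordinates bucketed into grid cells and colored by count. A rough Python sketch; the coordinates, cell size, and color thresholds here are invented, not from any particular analytics tool:

```python
from collections import Counter

# Hypothetical recorded click coordinates on a page, in pixels
clicks = [(120, 80), (130, 90), (125, 85), (700, 600), (705, 610), (128, 82)]

CELL = 100  # bucket clicks into 100x100 px cells
heat = Counter((x // CELL, y // CELL) for x, y in clicks)

def label(count, peak):
    # green/orange/red banding like the overlays described above
    ratio = count / peak
    return "red" if ratio > 0.66 else "orange" if ratio > 0.33 else "green"

peak = max(heat.values())
for cell, count in heat.most_common():
    print(cell, count, label(count, peak))
```

Real tools add smoothing and render this as a translucent overlay, but the "hot areas" are just these counts.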

So, as someone who's built a bunch of websites and worked specifically in startups, which tend to have this sort of stuff:

So twofold question, kind of the heat maps thing.

That is something that I, that I saw being included as like an analytics platform on
websites, right?

Like, you have the script, and then you can log in somewhere and see this heat map overlaid over your website.

That is what this means, right?

Yeah.

Awesome.

And so the second part of the question is: what would you say, is Google Analytics also a research method?

Yeah, yeah, I would say it is.

Because, like, we have... so, um, that's something that comes up often as a reason for us to conduct a UX audit.

So people come and say: look, we have Google Analytics, we have heat maps, we have all sorts of quantitative data, but we don't know what to make of it.

Like, we can see that the bounce rate is high.

We see people churning.

We can see all of that, but we don't know why.

Interesting.

I love that you bring that up because that's where we see the flaw of quantitative data in
that you're able to see where the problem lies, but you don't really know why.

It's when you use qualitative methods to understand the nuances, to understand the reason
why, that's when you'll be able to get the answers that you're looking for.

That's when you'll know, okay, this is why this is a problem.

Because for you, you might see that the CTA leads to a lot of trouble, but you don't have a clue as to why. You only know that the CTA should work well; but it's when you do that qualitative research that you're able to find out that, say, users who are colorblind do not see this CTA. And that's when you're able to actually change the color, as opposed to maybe removing it or replacing it, you know.

So I'm a huge advocate for using both qualitative and quantitative methods to find out the
numbers as well as the reasonings behind certain issues that you might have on your

website.

Let's close up this section with some super pragmatic, actionable advice on what to do with each of those methods.

Let's say you're a product owner, you're looking at the quantitative data that you have
available to you and you're just stumped.

You're looking at it for weeks and you're like: why? Why is this not working?

Like to stick with your example, there's a call to action and people just go away.

The bounce rate is high, you see that they go to another page and are not clicking on this button.

What would be your like rule of thumb advice?

Like what to do now?

You know what? The quantitative data has helped you figure out what the problem is, or at least surface this problem, using the demographics of this quantitative method.

So let's say, for example, users between the ages of 60 and 65, between nine and ten at night: that's when there is significant drop-off.

That is when you'd go and find users who are between the ages of 60 and 65, and test them during the times that have been highlighted as having the most significant drop-offs.
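That kind of segment slicing is easy to sketch in code. Assuming hypothetical analytics rows of (age, hour, converted), with every number below invented for illustration, the 60-to-65, nine-to-ten segment's drop-off could surface like this in plain Python:

```python
from collections import defaultdict

# Hypothetical analytics events: (visitor_age, hour_of_day, converted)
events = [
    (62, 21, False), (64, 21, False), (61, 22, False), (63, 21, True),
    (34, 21, True), (29, 14, True), (63, 10, True), (35, 22, False),
]

stats = defaultdict(lambda: [0, 0])  # segment -> [visits, conversions]
for age, hour, converted in events:
    band = "60-65" if 60 <= age <= 65 else "other"
    period = "21-22h" if hour in (21, 22) else "other"
    stats[(band, period)][0] += 1
    stats[(band, period)][1] += int(converted)

for segment, (visits, conversions) in sorted(stats.items()):
    # the ("60-65", "21-22h") segment shows 75% drop-off in this made-up data
    print(segment, f"drop-off {1 - conversions / visits:.0%}")
```

The numbers tell you which segment to recruit for qualitative follow-up; only the interviews then tell you why.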

In that session, when you're asking them these questions, you might find out that they're elderly, that when they're using this website it's at night time, or they're on the move, so they don't have time to really pay attention, but they always come back.

These are different nuances that will help you make an informed decision as opposed to
being stuck with this information or this problem and not knowing why.

Also, another really great point when it comes to quantitative data is that it allows you to surface the problem to your stakeholders, who are probably execs who don't have time and who work with numbers.

And when you highlight that, look, I have this problem and I want to follow up with qualitative research, they're able to understand why you followed up with that specific qualitative information to get this problem solved.

So it's both a tool to gather more research as well as a tool to understand nuances behind
this problem.

And would you suggest like a specific method from the list of qualitative ones that we
have?

You have this problem, you kind of know where it is and what it is.

Let's go interview some people? Or, like, what's the first step that you should look into, or what are the most popular options to start to get somewhere, I guess?

So it's case by case, depending on where you are in your product cycle, depending on how
much time you have, depending on your resources, depending on who you are going to be

presenting to.

All these play a part when choosing your method.

So using all that information, you're able to make a decision.

Luckily, we have created a quiz that helps you make these decisions.

It takes into account what cycle you're in with your product design, who you are presenting to, how much time you have, what your main goal is. And this quiz at least guides you toward what method you should be choosing.

Awesome.

Perfect.

We'll make sure to link that up in the show notes so you can just go and grab that. I'd be very interested to see what people do with it.

But generally, I think for me, what really helped was to not freak out about the method so much at all, but to just remember: when you can't know something from your Google Analytics data, ask someone who's trying to use your website.

And if you can't find anyone and you want to know it now, then just get someone who doesn't know anything about your problem, even if they're in another department in your company. Just go to them, sit them in front of a computer, and show them what you're trying to understand.

There might be something.

And of course, ideally, like you get people from your customer group exactly as you
described.

right?

Like get someone who matches the demographics and so on.

There's platforms online to source people who can then be your test participant.

And it's really not expensive either.

And I think it's fair to say it will definitely be worth your money.

So yeah, perfect.

So that was a lot.

There was a lot of stuff to take in, but also such a great overview.

I learned a lot. Even though we work at it every day, I think it's still very interesting to think about the fact that we're just making something for humans, that we're all not machines, and, yeah, it's always just a little bit fuzzy.

And so with that, maybe it's really interesting to end this podcast with a little discussion on: what does AI have to do with all of this?

Right?

Like I have the feeling there's so much discussion about the fact that we're human and the
AI is not human.

I don't know, there was this other headline about this company called Character AI, who is, I think, even fighting in court. I might be super wrong, but there was a case where, I think, Character AI was trying to get one of their characters to be legally represented as a person.

And so, I don't know, I think the lines are blurring so much, right?

Like chatbots are everywhere.

People are falling in love with them.

Everybody seems to have a ChatGPT account.

So I don't know, in the light of UX research, what's your current take on all of this?

Yeah.

So I think like many people, we don't know.

We don't know what could happen.

We're unsure: will it replace certain practices, will it replace certain jobs?

And my stance is that I think that AI will in fact enhance any sort of research.

It will make research faster.

It'll make it more accessible.

It'll make it more scalable.

However, there's no way that you can remove the human impact that comes with being a
researcher or sorting through this information.

So one of my arguments is that AI will transform research in the means of speeding it up
with the help of documenting these findings, pulling out evidence, recognizing patterns.

as well as just the whole logistics of research, which is often quite time consuming.

I could definitely see AI assisting in that often painfully long process.

One interesting thing that I heard about UX research and AI is so I think there was
someone talking about a company where you basically do usability testing with AI.

So like you're not talking to a real person, but you're kind of you're doing your
interview thing with an AI.

And I think the thought behind it is that an AI is trained on everything that humans know, and so on.

And then there was this gigantic discussion, of course, like immediate division into two
camps, like the ones that are all for it.

And they're like, well, this is probably good enough for what we're trying to understand.

And the others were like, but you're not talking to a human.

Like the AI is not going to use your product.

Any thoughts? Because I'm like, I don't know.

Yeah, that's so interesting.

Wow.

I think that that's a lot.

So I think that I do agree that human beings are very complex and multifaceted and very
unpredictable.

And I think with AI, it's informed by a specific action and is quite predictable, and sometimes it just makes up things that just aren't true.

And it's very problematic to take everything that comes from AI as the truth, because these things are created with biases in mind, created by human beings who have specific biases.

And that can become very dangerous very quickly.

Yeah, so I don't think that's a great way to utilize it.

I think it's always important to speak to human beings because human beings change, they
evolve and their thoughts aren't always going to be the same.

Also, I feel the premise is kind of wrong. I don't know, but at least a little skewed, you know? Like, when you said that AIs hallucinate and make stuff up, and people are also sometimes a little random, it might seem that those two things are sort of related, but I think they're just very different in terms of what it really stems from, right?

So humans are this amazingly multi-layered construct of emotions and logic and, you know, all of these things that play together to see something one way today and maybe feel different about it tomorrow.

Right.

The hallucination of an AI just comes from somewhere else.

Right.

Like it's not the same origin of entropy, so to say.

And as such, I think it's super dangerous. Like, I can kind of understand that you're trying to use a shortcut, right? I think that's also the appeal of this whole agentic thing, that you have an agent doing something for you.

But especially in research, what you're trying to get is, well, 24/7 access to a person to do your research as quickly as possible. You don't have to schedule, you don't have to pay them or wait, or then they're sick or whatever. All of these inconveniences, you don't need them.

But in this case, you don't want them to execute a task for you. You literally want their opinion, without steering them anywhere.

That's the backwards part for me about this whole discussion, where I think the principle of AI moving towards being our agents who can perform tasks is not what we want, at least not in the context of user research, right?

Yeah, yeah.

Also, I think humans build things for other humans, and you need to have that input from humans at one point or the other.

And I think it's really valuable to have that in user research: that you're able to know your clients, you're able to get that input straight from an actual human who you're building for.

And unfortunately, AI just can't give you that nuanced, complicated, multifaceted opinion that you're looking for from human beings.

You also mentioned bias as a problem here.

And so I think most of us might have heard that, because of the training material used to train an AI model, AI also has certain biases, based on what was put in there, in the output it generates. That applies now also in the context of AI versus humans as test participants in user research.

So I mean, your test participants probably also have a bias in some way or the other.

Like how would you compare these two things m in this context?

That's such an interesting question.

So I think with humans, you're able to sort of guesstimate some of the biases that they
might have when they're answering your questions.

let's say, for example, you're testing a mother with kids, right?

And you understand that she has biases because she currently has a child, and there are certain things that go into raising a child that come through when she's answering specific questions, right?

But with AI, you don't really have all that context.

You don't have all that nuance that helps inform or decipher an answer that you're
getting.

You don't know what the reasoning is, or you might get the reasoning, but you don't have it in perspective.

And it just lacks that nuance that you're able to get when it comes to a human being.

Also the way in which I found that AI delivers information, there's something about it
that makes you think that it's fact and you don't really question the information that

you're getting.

Whereas I feel like when you're talking to a human being, you're studying this person, you're looking at their body language, you're seeing certain quirks that only a human can see when they're speaking to another human.

And even if they are speaking quite factually, you're able to at least discern specific things.
things.

And with AI, you can't really get that. It's really tough for you to discern and try to understand and pick certain responses apart.

Yeah.

Exactly.

Also, every time that I've used AI, I've always been agreed with; I've always gotten a yes as an answer.

Even when I ask for a brutally honest response or any sort of disagreement, there's just this innate agreeability that you don't really get when you're speaking to a human being.

Yeah.

It never talks back, never says no.

Awesome!

Well...

I hate saying no.

Yeah.

Amazing Nils, thank you so much for having me.

It's always fun to be here.
