
The erosion of signal in a sea of slick, soulless noise


This image was sponsored by Canva Dream Lab - because I can't draw robots.

I recently posted a job on PPH looking for a freelancer to help me with some mock-ups. The spec wasn't complicated if you're experienced in designing for business leaders, can translate real-world requirements, and can tell me what I need, not just what I asked for. I wanted someone who understood the ask and who was willing to work collaboratively with me on it.


I had tons of responses that, on paper, sounded perfect. They addressed every point, showing they understood WHO I was looking for and what I needed them to deliver.


The frustrating thing is I could tell they were fake. Not just all the em dashes (—) and Oxford commas, but the tone. The generic language. The lack of personal touch. Sometimes factual errors, and sometimes, weirdly, a pitch that hit every single point in a way that felt... inhuman. ChatGPT at its finest.


I asked for collaboration, insight, human judgment.


Instead, I got an inbox full of algorithmically optimised mirrors, reflecting back my specification with eerie precision but none of the nuance, creativity, or messiness that real partnership involves.


Some of the profile photos were even AI-generated! Designed to make me buy what they thought I and others would want to see (though some of that said as much about our collective bias as it did about their fakery). Some portfolios looked great, once you sifted through the duplicate examples or the fake profiles handing each other five-star reviews.

The skills may have been there. But by then, the trust was gone.


Trust isn’t built on perfection. It’s built on presence. On the little signs that someone read my brief and thought about it. That they have their own voice. That they’re real and reachable. They gave me nothing that helped me to know the person behind the portfolio and, importantly, whether I could work with them. In some cases that would be OK, but in this case I wasn't just buying services from... an algorithm. I was buying collaboration, insight, chemistry. And chemistry isn’t in a bullet point or a Figma example. It’s in the unexpected questions someone asks, the challenge they pose to your assumptions, the spark of, “Wait, what if we flipped it this way instead?”


When Everyone Sounds Perfect, No One Feels Real


It’s not just freelancing. Dating profiles? Check.


Resumes and cover letters? Check. Check.


No wonder the job market is going to hell. How can a recruitment manager work out who to shortlist when everyone sounds the same? Small wonder they're employing AI to do the initial sift... Oh wait - can a robot tell it's talking to another robot!? (And other comedy, coming soon). When did we cross the line from using these tools to help with work to letting them replace the effort entirely?


What’s unnerving isn’t that people are using AI to assist - I do too. It's that they’re using it to perform a version of themselves. And the performance is too perfect. There’s a rhythm to it. The polished-but-generic syntax. The slightly-too-on-point language. So now, ironically, when everyone sounds “ideal,” no one stands out. It becomes harder, not easier, to choose.


I am no Luddite. I'm excited by where tech is going and I believe it has so much good to give. But I can't help wondering if we've crossed a line, not just technologically but ethically, in how we show up. Tools that were meant to support thought are now mimicking it. And in that mimicry, the very things that make people hireable, dateable, or trustable - our real voices, lived experience, perspective - are being airbrushed out in favour of plausibility.

We’ve long been outraged at how magazines airbrush bodies, knowing the harm that does, particularly to young people trying to make peace with their own image. I tell my daughter constantly - “Don’t compare yourself to something unreal.”


But now we’re doing the same thing with ideas. With selves. With work. We’re shaping expectations not around what’s true, but around what reads as credible.


And credibility without authenticity? That’s just another filtered feed.


And no, according to ChatGPT (who I now call "Dave"), a robot can’t always tell when it’s talking to another robot, but a human still can if they’re paying attention.


"So what?", someone asked me mid-rant.

So what if the profiles are fake, if the work still gets done?

So what if the words aren’t “real,” if the result is usable?


Here’s the “so what”:

  • Trust dies in this grey zone. 

  • Differentiation gets crushed by sameness. 

  • Creative collaboration evaporates when the partner on the other end has nothing to push back with. 

  • Reputation becomes harder to build when no one knows if your testimonials, photos, or voice are even yours. 

  • And the kicker? Apparently competence without character is replaceable. If they’re faking the human part, they’ll be replaced by someone who fakes it better, or eventually by the tool itself.


It's OK to want efficiency, accurate insights from your data, automation where it counts, and more, but savvy clients are already craving the real when it comes to interaction. Just look at all the businesses trying to get people back into the office. We want real insight. Real dialogue. Real craft. Not just outputs that tick boxes, but experiences that leave fingerprints.


Reminder: People buy from people. Human Connection Matters.

