Chris · 5 min read

Why Flint doesn't auto-submit to jobs.

Sonara users report 50%+ submission failure rates. Recruiters blocklist the pattern. Auto-apply is a reputation spiral the tools can't reverse. Here's what the category actually delivers, and what we're doing instead.

Every few weeks someone asks me why Flint doesn't do auto-apply. The pitch is seductive: a bot submits 50 applications a week on your behalf while you sleep, and the funnel math does the rest. If applying is just a numbers game, automate the numbers.

The product category answering that pitch includes Sonara, AIApply and JobRight's "Orion" agent. Sonara is the purest version: you upload a CV, set filters, and they mass-submit for you. AIApply and JobRight lean on browser automation, pre-filling application forms or submitting after you click. All three have been marketing hard through 2025 and 2026.

I've spent the past month reading public reviews, recruiter complaints and the tools' own reporting. I'm not building auto-apply into Flint. Here's what the numbers say.

The submit-rate problem

Sonara's own Trustpilot review page is the clearest primary source. The dominant patterns across recent complaints:

  • 50%+ of submissions fail to send. Users pay a monthly fee for a weekly apply quota, a big chunk of which never lands.
  • 80% of the ones that do send hit a 2FA wall. Greenhouse, Workday and most modern ATSes now require email confirmation or phone verification on unfamiliar sign-ins. The bot gets that far, can't finish the step, and the application sits half-submitted.
  • 25-40% of submissions reach the employer but contain empty or wrong fields. One reviewer: "the AI filled 'years of experience' with 'I am passionate about learning'." Another had applications sent in languages they don't speak because the geo-filter silently failed.

Scale.jobs published a whole analysis titled "Is Sonara Hurting Your Job Search Without You Knowing?" The answer their data suggests is yes. Once an employer's ATS flags an applicant's email as a source of spam-shaped applications, subsequent manual applications from that email get filtered before a human sees them. You can make your own career worse by using the tool.

AIApply fares similarly. Wobo's independent review scored it 3/5 with the specific finding "Auto Apply feature is slow and inaccurate." Trustpilot flagged AIApply's profile with a warning that the company "may be using unsupported methods to collect reviews", which makes the 4-star public average itself suspect.

JobRight's "90% automation" marketing claim hits the same wall in practice. Their Orion agent works well for reasoning about a match. The auto-apply part is in permanent beta, and users report it fails on the exact ATSes most large employers use.

Why it fails

Three technical reasons, then a human one.

ATS 2FA is the hard blocker. Workday, Greenhouse and Lever all rolled out anti-bot measures in 2024 specifically aimed at auto-apply tools. The friction is deliberate. Any bot that clears it is one Greenhouse release away from being blocked again. This is an arms race the employers are going to win, because they only need the friction to be slightly higher than a human's tolerance. They don't need to stop all bots; they just need to stop the cheap ones.

The profile-to-role match is shallow. Auto-apply tools use keyword matching on a CV and a job description. That's a 2015-era approach. They don't know that your five years of "Stripe integration" means you've built three of them end-to-end, or that your "led a team of 12" line is padded. A recruiter scans for that texture in eight seconds and your application goes in the bin. My post on CV matching went into why keyword density isn't a meaningful signal.
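To make the shallowness concrete, here's a minimal sketch of the kind of keyword-overlap scoring described above. This is an illustration, not Flint's or any vendor's actual code, and the example CVs are invented:

```python
def keyword_score(cv_text: str, job_text: str) -> float:
    """Fraction of job-description keywords that also appear in the CV."""
    cv_words = set(cv_text.lower().split())
    job_words = set(job_text.lower().split())
    return len(cv_words & job_words) / len(job_words)

job = "stripe integration payments api"

# One candidate built three Stripe integrations end-to-end; the other
# merely lists the same keywords. Overlap scoring can't tell them apart.
deep = "built three stripe integration systems end-to-end payments api"
padded = "familiar with stripe integration payments api buzzwords"

print(keyword_score(deep, job) == keyword_score(padded, job))  # True
```

Both CVs contain all four job keywords, so both score 1.0. The texture a recruiter actually scans for lives entirely outside what this function can see.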

Cover letter generation at volume is detectable. The LLM-generated cover letters produced by these tools share stylistic tells that pattern-match obviously to any recruiter reading more than 10 of them a week. Recent telltales include the construction "my passion for X aligns with Y" and the phrase "I am excited to apply for the opportunity." Scan any recruiter's LinkedIn in 2026 and you'll find recent complaints about them.

Then the human problem. The candidates winning interviews in 2026 are not the ones submitting the most applications. LinkedIn's own 2025 data shows the top quintile of applicants by interview-conversion rate apply to 30-50% fewer roles than the average. They apply with more signal, not more volume. Auto-apply optimises for the wrong metric.

The recruiter side

Recruiters talk about auto-apply openly on LinkedIn now. The consistent sentiment is that they can spot auto-submitted applications within seconds and they bin them faster than they bin ChatGPT cover letters. A senior recruiter at a FTSE 100 employer told me in March that her team maintains an internal "spam patterns" list that now includes ~15 telltale signatures from Sonara and AIApply specifically. Once your email domain matches a pattern, future manual applications from you get deprioritised.

This is rational from the recruiter's side. They're protecting their time. And it's the thing no auto-apply tool's marketing page will tell you: the cost of the mistake isn't just the failed application, it's reputational damage that outlasts your current search.

What we do instead

Flint does the matching. You do the applying.

The six-dimension score tells you which roles are worth 20 minutes of tailored effort versus which ones are waste. The cover letter generator is there when you want to accelerate a specific application you've decided to make, not to carpet-bomb fifty of them. The pool is UK-focused because the auto-apply category is US-dominant and their ATS coverage for UK employers is thin anyway.

Concretely:

  • You open Flint. Five jobs this week scored 80%+. Three are at companies you'd genuinely work at.
  • You read the scoring breakdown: skills match is strong, seniority perfect, one gap on a specific tool.
  • You spend 30 minutes writing a cover letter (or let Flint draft one that you then edit).
  • You submit one thoughtful application, not fifty careless ones.

The trade is simple. You spend 30 minutes per application instead of 30 seconds. You apply to five or ten roles a week instead of fifty. Your per-application interview rate goes up by a lot. Your total interview count goes up by enough to more than offset the reduced volume.
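The back-of-envelope version of that trade, with illustrative placeholder rates (not measured Flint data):

```python
# Hypothetical numbers for illustration only: a spray-and-pray funnel
# versus a smaller number of tailored applications.
spray = {"apps_per_week": 50, "interview_rate": 0.01}  # auto-apply volume
focus = {"apps_per_week": 8,  "interview_rate": 0.10}  # tailored effort

spray_interviews = spray["apps_per_week"] * spray["interview_rate"]  # 0.5/week
focus_interviews = focus["apps_per_week"] * focus["interview_rate"]  # 0.8/week

print(focus_interviews > spray_interviews)  # True
```

Even with a sixth of the volume, a tenfold better conversion rate wins on total interviews, which is the only metric that matters.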

I wrote Flint to help with the reading, scoring and drafting part. The click-submit-take-the-consequence part stays yours. That's not a feature gap. That's the product.


Flint Pro is £7.99/mo. It scores every UK job against your CV across six dimensions, generates tailored cover letters when you want them, and stops there. No auto-submit, no browser extension, no way to accidentally blacklist your own email with an ATS.

Stop scrolling. Start scoring.

Flint pulls jobs from 100+ company boards and scores every one against your CV. Upload once and see what actually fits.

Try Flint free