In the summer of 2023, I started my life over again. I packed up everything I owned into my Volvo sedan and drove across the country, leaving my friends and family in New York City for a new life in San Francisco. I needed a new job, a new apartment, and new friends. I spent my days driving through endless corn fields and my nights sending cold messages on Craigslist and LinkedIn. By the time I arrived on the West Coast, I had learned a valuable lesson: there’s something very wrong with these websites.
Digital tools have supposedly made finding anyone and anything easier than ever. There are websites for finding jobs, roommates, and even life partners. Facebook housing groups have tens of thousands of members, and there are unlimited jobs on LinkedIn and potential romantic partners on Hinge. But on that summer road trip, I learned what anyone who has gone down the rabbit hole of right swipes and quick-applies learns the hard way: these systems do not work.
In finance, liquidity is an unalloyed good. Growing the pool of buyers and sellers closes the gap between what participants expect and what they can actually get; in theory, everyone ends up with what they want. But in formerly trust-based domains like dating and employment, where employers play the role of buyers and job seekers the role of sellers, flooding both sides with more options than they can count has had the opposite effect. Rather than making it easier to find the perfect person, these tools have buried the market in noise and made it harder to find what we’re looking for. In trading trust for scale, we’ve lost something critical. What went wrong?
Part 1: Problems With Liquidity
Spam
Ask anyone who has posted a job online (or any heterosexual woman with an online dating profile) about their experience, and this is usually the first complaint you’ll hear. The easier it is to apply, the more spam you get. Attracting interest is easy, but most of it is low quality and useless. Asking a woman out in person is hard; sending a copy-pasted message on an app is easy. This incentivizes participants to play a numbers game, where volume is king and effort is mostly a waste of time. With more applicants than they can carefully review, buyers resort to blunt filters. This creates a vicious cycle: high rejection rates further incentivize high application volume.
Bad Information
Resumes and dating app profiles are poorly suited to conveying the traits that actually matter. They capture highly legible characteristics like school prestige and two-dimensional attractiveness, but they provide little to no information about compatibility, integrity, or level of investment. This compounds the problem of spam: buyers are buried in information, almost none of it useful.
Low Trust
Most people do not trust strangers. The anthropologist Robin Dunbar proposed that people can maintain stable social relationships with only about 150 others, which amounts to a few hours’ worth of applicants on the average LinkedIn posting. By expanding the pool of applicants by several orders of magnitude, platforms have all but eliminated the chance of a first- or second-degree connection lending a measure of trust to the process. Everyone enters the interaction maximally suspicious of the other side’s integrity, which leads to lower-quality relationships.
These problems compound into a process that gives everyone less of what they want. As searching gets easier, finding becomes harder.
Part 2: Why Does This Happen?
Three frames from economics help explain why more liquidity in a marketplace does not always lead to better outcomes.
The Market for Lemons
George Akerlof’s The Market for Lemons explains that markets with asymmetric information tend toward a predictable failure mode. Akerlof examined the used car market, where sellers know much more about the quality of their cars than potential buyers do. Because buyers can’t distinguish good cars (“peaches”) from bad ones (“lemons”), they will only pay a price that reflects average quality. Sellers of high-quality cars, unwilling to accept that discounted price, exit the market. With the best cars gone, average quality falls, the price buyers will pay falls with it, and the cycle repeats until only lemons remain.
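To make the dynamic concrete, here is a stylized simulation of adverse selection (my own sketch, not Akerlof’s formal model): sellers know their quality, buyers will only pay for the average quality of whatever remains, and the best sellers keep walking away.

```python
# Stylized adverse-selection loop (an illustrative sketch, not Akerlof's formal model).
# Sellers know their car's quality; buyers only offer the average quality of the
# remaining pool; any seller whose car is worth more than the offer exits.
import random

random.seed(0)
qualities = [random.uniform(0, 1) for _ in range(10_000)]  # each car's private quality

for round_num in range(1, 6):
    offer = sum(qualities) / len(qualities)            # buyers pay for average quality
    qualities = [q for q in qualities if q <= offer]   # above-average sellers leave
    print(f"round {round_num}: offer = {offer:.2f}, cars left = {len(qualities)}")

# The offer falls every round because the best remaining cars keep leaving:
# the same spiral the next paragraph maps onto resumes and dating profiles.
```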
On job sites and dating platforms, the asymmetry arises because the signals that matter (competence, compatibility, commitment) are communicated poorly by resumes and profiles. Good candidates become indistinguishable from bad ones. Employers and potential partners adapt with skepticism and harsh filtering, further driving away serious candidates. Over time, the pool degrades until it is dominated by unserious or flaky participants.
Signaling Theory
Nobel laureate Michael Spence examined the role of incomplete information in labor markets. He observed the inherent information asymmetry between organizations and prospective employees: employers need to distinguish between candidates of differing quality but have limited information with which to do so. Spence posited that, to establish their ability and competence, candidates need to send signals that are costly to obtain, difficult to fake, and cheaper for genuinely capable people to acquire than for everyone else. The canonical example is a degree, especially one from a prestigious institution.
Spence described this pattern in 1973, but digital scale has vastly increased the volume of applicants organizations must review, and with it their reliance on signals that can be interpreted quickly. Deep but difficult-to-parse signals, like cover letters and nuanced experience, get deprioritized. In their place, overfit proxies dominate: degree prestige, job titles, keyword matches, and brand-name employers. These are worse signals of competence, but they are cheaper to detect.
Spence’s original insight was that signaling only works if it’s hard to fake, and in frictionless online environments it has never been easier to fake. Anyone can copy and paste bullet points from job descriptions to game the keyword filters in applicant tracking systems, misrepresent their actual role and skills, and inflate their profiles with over-optimized language. As candidates learn to exploit the rules of the game, the value of the underlying signal diminishes. The people most likely to make it through this system are those willing to exploit it most aggressively.
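To see how cheap these proxies are to fake, consider a toy keyword filter (a hypothetical sketch, not how any particular applicant tracking system actually works): the score rewards overlap with the posting’s own language, so pasting that language beats describing real experience.

```python
# Toy keyword screen (hypothetical; no real applicant tracking system is assumed here).
JOB_KEYWORDS = {"python", "kubernetes", "stakeholders", "agile", "leadership"}

def keyword_score(resume_text: str) -> float:
    """Fraction of the posting's keywords that appear somewhere in the resume."""
    words = set(resume_text.lower().split())
    return len(JOB_KEYWORDS & words) / len(JOB_KEYWORDS)

honest = "Built data pipelines in Python and mentored two junior engineers"
gamed = "python kubernetes stakeholders agile leadership " * 3  # pasted from the posting

print(keyword_score(honest))  # 0.2: real experience, little keyword overlap
print(keyword_score(gamed))   # 1.0: no substance, perfect score
```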
Tragedy of the Commons
Free platforms can be viewed as a commons in which employers, apartment listers, and dating partners invest their finite attention. The economist William Forster Lloyd first described the underlying dynamic, later named the “tragedy of the commons” by the ecologist Garrett Hardin: a resource freely available to everyone gets degraded because each individual is incentivized to overuse it. Each herder wants to graze as many animals as possible on the shared pasture, but if everyone does so, the pasture is destroyed. The tragedy arises wherever a collective interest in moderation clashes with an individual interest in overuse.
On platforms, individual incentives lead users to flood systems with low-quality messages, hoping to overwhelm long odds with sheer volume. But while rational, this strategy pollutes the shared resources of attention and trust, reducing the effectiveness of the overall marketplace. In order to compete with spammers, quality candidates must play the same game. This positive feedback loop continually degrades the attentional commons and weakens its utility as a shared resource.
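The incentive structure is easy to see with some back-of-the-envelope numbers (my own illustration with assumed values, not a cited model): suppose a reviewer carefully reads only 50 applications per posting, chosen at random.

```python
# Attentional-commons sketch with assumed numbers (illustrative only).
READS = 50      # assumed attention budget: careful reads per posting
OTHERS = 999    # other applicants competing for the same posting

def p_at_least_one_read(my_apps: int, others_apps_each: int) -> float:
    """Chance that at least one of my applications gets a careful read."""
    total = my_apps + OTHERS * others_apps_each
    p_each = min(1.0, READS / total)
    return 1 - (1 - p_each) ** my_apps

print(p_at_least_one_read(1, 1))    # ~0.05  everyone sends one application
print(p_at_least_one_read(20, 1))   # ~0.63  I alone spam: a big private payoff
print(p_at_least_one_read(20, 20))  # ~0.05  everyone spams: odds collapse back down,
                                    #        with twenty times the effort burned
```

Defecting pays only until everyone defects; after that, the commons is spent and no one is better off.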
Part 3: What Can We Do?
By making connection effortless, platforms have created a new set of challenges; frictionless communication trades one set of limitations for another. If removing friction caused these problems, reintroducing smart friction offers a path forward. Two ways to approach this are raising application barriers and inserting trusted intermediaries.
Charge People Money
No Dumb Ideas went through the same application gauntlet I did and came up with an interesting solution: charging a small amount of money to apply for a job. This would reintroduce friction and make spamming applications somewhat less attractive. Universities already charge application fees (at the time of writing, it costs $90 to apply to Stanford undergrad), providing at least a limited proof of concept. And other platforms, such as X/Twitter and Meta, have embraced paid verification as a way to reduce spam.
I have a few concerns about this system. First, it’s currently illegal in several states, including California. Second, it gives employers a perverse incentive to solicit applications for the fee revenue, though this could be mitigated by donating the money to charity. Third, it would be difficult to set a fee high enough to genuinely deter spammers yet low enough not to punish low-income applicants. If employers charged a dollar to review applications, spamming a few hundred of them would not be cost-prohibitive for any middle-class person; raise the fee too much, and they risk losing talented but financially vulnerable applicants.
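Rough arithmetic makes the tuning problem concrete (the numbers below are my own illustrative assumptions, not figures from any proposal):

```python
# Fee-tuning arithmetic with assumed, purely illustrative numbers.
SPAM_CAMPAIGN = 300    # applications a determined spammer might send
TARGETED_SEARCH = 25   # applications in a focused, honest job search

for fee in (1, 5, 20):
    print(f"${fee:>2} fee: spammer pays ${fee * SPAM_CAMPAIGN:>5,}, "
          f"careful applicant pays ${fee * TARGETED_SEARCH:>4,}")

# At $1, a 300-application spray costs $300: irritating, not prohibitive.
# At $20, spam costs $6,000, but a focused search costs $500, which is real
# money for someone who is between jobs.
```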
Bring In a Broker
Maybe what’s missing from the process is a skilled middleman. Most markets do not operate as a free-flowing chaos cloud of buyers and sellers. Financial markets have market makers, who act as intermediaries; real estate agents, matchmakers, and recruiters already fill a similar role in their respective marketplaces. These parties reintroduce trust by staking their own reputations on a smoothly functioning exchange. They can enforce standards and accountability and reduce information asymmetry, making high-quality candidates less likely to exit the market in frustration.
At present, these services require significant time and attention from skilled human beings, making them expensive. This means they eliminate many of the inherent scale advantages of an online platform. In the future, this area seems ripe for disruption by AI, which could theoretically apply a careful and nuanced screening process while maintaining functionally limitless throughput. A recruiter-bot could go beyond keyword searching: asking applicants incisive questions, cross-referencing their social graph, and identifying behavioral red flags at scale. Some people would doubtless be unhappy about being subjected to such a system, but compared to the keyword-searching resume bots currently in use, maybe it’s not the worst alternative.
The current system for finding things online is broken. People apply to thousands of jobs, swipe right on thousands of partners, and mostly experience failure and frustration. Without filtering, the attentional commons has been co-opted by manipulators, bad actors, and spammers. In order to keep up, ordinary participants must adopt cynical techniques that perpetuate these systemic failures. Digital platforms have transformed the experience of searching, but they have also removed the trust and accountability required for meaningful interaction. Rather than being distracted by spam, we are becoming it.