Highlights

  • When constraints in a system change, and what was previously scarce becomes abundant, value shifts within the system. Some forms of labor lose value as the underlying constraint disappears. Others gain value precisely because they resolve what the new system cannot make abundantly accessible.
  • Much like cold chain logistics removed seafood scarcity, AI changes the scarcity associated with certain types of knowledge work. But in doing so, it will expose new bottlenecks and constraints. These are points in the system where trust, context, and interpretation become the new scarcities. Some jobs will vanish or lose value. But a few, like the fugu chef, will paradoxically gain value.
  • Every role in an organization exists to resolve a constraint. This is easy to miss because we associate roles with fixed titles like engineer, doctor, or teacher. But every such role exists to move work along, and what prevents work from moving along is some form of constraint. Remove or shift the constraint, and the logic for that role starts to break down.
  • A role unbundles when tasks that once needed to be performed together no longer need to coexist in the same role. Sometimes this is driven by technology. When automated checkout systems eliminated the need for cashiers to handle both payment and bagging, the cashier wasn’t displaced; the role was unbundled, and cashiers migrated to other tasks like troubleshooting customer checkouts or upselling items. Other times it’s structural. When Covid-driven remote work decoupled collaboration from location, managerial roles were unbundled and rebundled around collaboration tools like Miro, Slack, and Notion. In both cases, the role as a bundle of co-dependent tasks falls apart.
  • The mistake people often make in responding to AI is assuming they are competing with the machine. Instead, they should be asking: where is the machine creating a new constraint that only a human can resolve?

    In other words, don’t just look at what AI can’t do. That will pit you in a race against an ever-improving machine. Look, instead, at what it breaks in the system.

  • If an AI can generate thousands of marketing variants per hour, the constraint shifts to human discernment: who decides what’s likely to land emotionally? If an AI assistant can draft legal memos, the constraint moves to the lawyer’s judgment in spotting where the model has overgeneralized or hallucinated.
  • With improvements in AI, the role of the radiologist has changed: she no longer gains value by identifying abnormalities in scans alone. AI can do that with remarkable accuracy. Instead, her value lies in interpreting corner cases, communicating risk to patients, and working in case groups with other specialists to resolve complex cases.
  • Such role shifts happen whenever the system changes. The arrival of GPS-aided navigation transformed the role of the driver from someone who plans a route to someone who watches for the exceptions where the machine gets it wrong and interprets when and how to act on its guidance.
  • You might be fascinated with everything AI can increasingly do. But the roles of the future are found in the coordination gaps and new constraints created, not despite, but precisely because of, AI’s relentless execution.
  • Intrinsic value is about meaning; economic value is about exchange potential.
  • Contextual value is a measure of how crucial a task is to the performance or stability of a larger system. It’s not about whether a task is hard to do or emotionally meaningful; it’s about whether it is structurally indispensable to a system operating under certain constraints.
  • As the way workflows are executed changes, certain tasks within those workflows lose contextual value and others gain it. Data labeling in machine learning is a great example. Prior to the early 2010s, tagging images or annotating text held virtually no value. It was viewed as a low-skill, manual chore. Arguably, it had low intrinsic value. But when supervised learning models became the dominant approach in AI, the constraint in the system changed. The bottleneck was no longer in computing power or model architecture, but in access to labeled data. Annotators now determined model accuracy. Their work carried high contextual value and determined the performance of the larger system. Even if the intrinsic value had not changed, the contextual value was now much higher.
  • As we reimagine our work alongside AI, we need to look for roles that command both high contextual value and high economic value. Economic value ensures you get paid well. And contextual value ensures you are critical to the new system.
  • The most valuable roles are those that command economic value while ensuring high contextual value. These roles are rare and not easily displaced. They are, accordingly, paid very well.
  • But as digital infrastructure matured and algorithmic trading took over, machines, which didn’t get tired and never made emotional trades, started outperforming human traders. Value shifted as traders who had once thrived on reflexes found themselves redundant. In their place, new roles emerged: algorithmic strategists and behavioral signal analysts. The tasks of buying and selling were still central to the system, but the human role had transformed from performing the trade to interpreting the market system and identifying second-order patterns the models couldn’t see.
  • Judgment becomes more valuable in a world of frictionless execution. This is why prophecies about radiologists becoming redundant don’t actually play out. Radiologists who were paid to examine scans and identify anomalies found a new role when AI could identify tumors better than they could. Their role had shifted to judgment - to deciding what a scan means in clinical context and what appropriate next steps should be triggered given the larger context of the patient. Machines were better than radiologists at image classification. But radiologists now have high contextual value in the new system, thanks to their clinical judgment and understanding of the larger patient context.
  • These are examples of role migration where the new role has high contextual value as well as high economic value - the ideal place you want to end up in. And you don’t get there through ‘reskilling’. You get there by understanding what the AI cannot understand - the larger complex system within which it is operating.
  • What ties all of these examples together is a shift in the decision constraint. When data and information are abundantly available and agentic execution replaces slow, inefficient human execution, knowing what to do based on how you interpret the facts becomes increasingly valuable. The new roles that emerge are rebundled around judgment.