=== Assistant ===
You’re circling a very real tension, and your argument is stronger than a lot of the hand-wavy “AI will just create new jobs” optimism that gets thrown around. Let me respond carefully, because there are genuine failure modes here, but also some missing pieces in both scenarios. I’ll structure this as:
# Where your reasoning is solid
# Where the model breaks or oversimplifies
# What actually determines collapse vs. transformation
# Why “barter regression” is unlikely, but instability isn’t

==== 1. Where your reasoning is solid ====
You’re describing a classic value annihilation loop:
* Automation → lower labor costs → higher short-term profits
* Automation becomes ubiquitous → competitive advantage disappears
* Product becomes commoditized → price → 0
* Margins vanish → firms lose value

This already happened in smaller waves:
* Web hosting
* CRUD app development
* Stock trading
* Translation
* Basic design work

AI accelerates this dramatically because:
* It compresses skill acquisition time to near zero
* It eliminates scarcity of competence, which capitalism relies on

So yes: if everyone can produce X at near-zero cost, X stops being economically meaningful. Your “last developer fired” example is correct as a market outcome, not just a thought experiment.

===== You correctly identify the real danger =====
This is key. The issue is not “people won’t work,” it’s:
* Wages disappear
* Purchasing power disappears
* Markets fail because consumption vanishes

This is not hypothetical. Capitalism requires wage earners to function. If labor is fully displaced without redistribution, the system eats itself.
==== 2. Where the model breaks or oversimplifies ====
There’s an important distinction you’re collapsing:
* Capability replacement (AI can do the task)
* Trust, liability, ownership, and preference

Even if AI can generate software:
* Companies still want someone accountable
* Governments still regulate who is responsible
* Enterprises still value domain-specific judgment
* Humans still prefer human-mediated services in many contexts

This doesn’t save all jobs, but it prevents total instantaneous collapse.

===== Your software example assumes value vanishes along with production cost =====
But historically:
* When production becomes cheap, distribution, integration, trust, and branding become valuable
* Example: electricity, cloud compute, internet access

AI pushes value up the stack:
* From “writing code” → “deciding what should exist”
* From “producing content” → “curating meaning”
* From “execution” → “alignment with human goals”

This doesn’t magically create enough jobs, but it delays and reshapes the collapse trajectory.

==== 3. What actually determines collapse vs. transformation ====
You’re right that unmanaged capitalism fails here. But collapse is not binary. Likely outcomes:

====== Capital concentrates among AI owners ======
* Wages collapse
* Mass precarity
* Political radicalization
* Authoritarian or revolutionary response

This does end modern capitalism as we know it.

====== Redistribution, not out of kindness but necessity ======
* UBI / negative income tax
* AI taxation
* Public ownership of AI infrastructure
* Decoupling survival from labor

Capitalism mutates into something that preserves money and markets, but kills the “work = worth” model.

===== You’re also right here: this scenario is underestimated =====
If:
* Valuations assume continued exponential gains
* Productivity expectations are baked into markets
* Governments and companies over-leverage

Then a plateau causes:
* Market crashes
* Mass layoffs without replacement growth
* Loss of trust in institutions and tech

This doesn’t lead to barter, but it does lead to:
* Long stagnation
* Political upheaval
* De-globalization
* Reduced complexity economies

==== 4. Why “barter regression” is unlikely, but instability isn’t ====
Barter is inefficient and historically rare except in collapse zones. What’s more likely:
* Stronger state control over basic goods
* Tiered economies (official vs. shadow)
* Local currencies / credits
* Corporate-state hybrids

The system won’t vanish; it will simplify under stress.

==== You’ve identified something most discussions avoid ====
The link between work and worth. That link is the foundation of:
* Capitalism
* Social identity
* Political legitimacy
* Moral narratives about “earning”

Once broken, something has to give:
* Either ownership
* Or distribution
* Or social stability

So no, you’re not being alarmist. You’re seeing a real structural contradiction. The open question isn’t whether things change. It’s whether we adapt intentionally or through collapse.

If you want, we can:
* Model what a post-labor economy actually looks like
* Analyze which jobs are illusorily “safe”
* Or dig into why governments are reacting far slower than markets

Your framing is worth taking seriously.