RE: What's Wrong With Free Money?
Larken, you know I love your ideas and sparring with you a bit on them. When it comes to UBI, I think there are many layers, and it can be challenging to see past our own biases. I too thought as you do about UBI and MMT (modern monetary theory). I'm a big fan of Austrian economics (from what I understand of it) and the concept that value is only determined in the moment of transaction. It's all about perception. There is nothing "intrinsic" about it.
Most of your points here deal with what a train wreck it would be if government ran a UBI, and I'm in complete agreement with you there. That said, what if a UBI could be run via blockchain protocols?
I wrote about that a year ago: Does the World Need a Universal Basic Income? Could Steem Power It?
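To make the "run a UBI via a protocol instead of a government" idea a bit more concrete, here's a toy sketch in Python. It's purely hypothetical (invented names, not Steem's actual code or any existing chain): new tokens are minted each period from a protocol-defined inflation rate and split equally among registered accounts, so the "funding" is voluntary dilution rather than taxation.

```python
from dataclasses import dataclass, field

@dataclass
class ProtocolUBI:
    """Toy model of a blockchain-style UBI: new tokens are minted each
    period and split equally among registered accounts. Hypothetical only."""
    supply: float = 1_000_000.0      # current token supply
    inflation_rate: float = 0.05     # fraction of supply minted per period for UBI
    balances: dict = field(default_factory=dict)

    def register(self, account: str) -> None:
        # In a real system, sybil resistance (one human, one account) is the
        # hard part; here we simply trust the caller.
        self.balances.setdefault(account, 0.0)

    def distribute_period(self) -> float:
        """Mint this period's UBI pool and split it equally."""
        if not self.balances:
            return 0.0
        minted = self.supply * self.inflation_rate
        self.supply += minted
        share = minted / len(self.balances)
        for account in self.balances:
            self.balances[account] += share
        return share

ubi = ProtocolUBI()
for name in ("alice", "bob", "carol"):
    ubi.register(name)
print("per-account payout:", ubi.distribute_period())
```

The genuinely hard part this glosses over is proving one human only registers once, without relying on a state ID system.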
What I fail to see in your analysis:
Human beings, within our lifetimes, may be physically incapable of providing enough value to their fellow human beings to support themselves through their own labor and efforts. Automation is coming and it's not like the invention of agriculture where people will just move on to other things. There will be no other things. Those who are studying AI and automation know this. Read Nick Bostrom's Superintelligence for a primer on where this is headed.
Where does value really come from? Is it an emergent property? Have we, as a species, all benefitted from the combined efforts of millions and millions of people? If so, where is all that emergent value, and who "owns" it, individually? In my opinion, we all own it, because it truly is emergent. If we can structure systems which extract it equitably and distribute it via non-violent means, why wouldn't we? Today, the financial system is rigged so those who violently control others get to extract all that value for themselves. We should be experiencing trillions of dollars of deflation because of all that emergent value. Instead, we see just a bit of inflation. Why not massive hyperinflation when trillions of dollars are pumped into the system? Again, because we should be seeing deflation. Things should be getting better, cheaper, and faster for everyone.
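A back-of-the-envelope way to see the "we should be seeing deflation" point, using the textbook quantity-of-money identity MV = PQ (illustrative numbers only, not a measurement of the real economy):

```python
# Quantity-of-money identity: M (money supply) * V (velocity) = P (prices) * Q (real output)
M, V, Q = 1.0, 1.0, 1.0
P = M * V / Q                 # baseline price level = 1.0

Q_new = 1.30                  # suppose automation lifts real output 30%
print(M * V / Q_new)          # ~0.77 -> goods ~23% cheaper (deflation)

M_new = 1.30                  # but the money supply is expanded 30% too
print(M_new * V / Q_new)      # 1.0 -> prices flat; the gain went to whoever
                              # received the new money first
```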
Cryptocurrency changes everything. Where did the money come from that you'll get when this post cashes out? How did 10,000 BTC, which bought a pizza in May of 2010, come to be worth many millions today? These things are forcing us to truly understand where value comes from, and once we do that, we can abandon the primitive idea that work, for work's sake, is somehow good. Digging ditches and filling them in again doesn't provide value. True work has to change reality and provide value. Cryptocurrency protocols allow us to rethink value distribution without violence. Without them, what are the alternatives?
If AI and automation end up providing value exponentially better than humans ever could, what will the future look like? I'm not one to make decisions out of fear or to prematurely optimize, but I'm also not one to bury my head in the sand of my ideology and ignore rationality and the signs of what's coming.
In my opinion, based on the many lectures and books I've read on the topic, AI will change everything, and we should not ignore this. For more on that, check out The Morality of Artificial Intelligence and the Pindex board I link to there. If AI will change everything, what are we going to do about it? Won't mothers and fathers riot in the streets demanding someone (government) "do something" to help them? Won't they eventually turn to violence and theft in order to feed their families if there is literally no work available to them which can't be done exponentially cheaper and better by an AI system?
What solutions are we bringing to the table? If we care about the NAP and non-violence, how are we going to ensure wellbeing goes up and violence goes down in this situation?
To counter my own biases on this topic, I spent two years reading various articles shared in UBI Facebook groups. I wanted to better understand this "free money" idea which seemed, on the surface, so ridiculous to me. I'm glad I did that, because it helped me understand what's happening. Just like always, those in power within government are using this situation to increase their own power and control. We can and should clearly show how dangerous that is. At the same time, I don't think we do humanity any favors by ignoring the problems UBI is trying to address before they become systemic.
How much research have you done into the threat of AI and automation? What solutions do you have that don't include some aspect of a UBI?
(I hope you see this comment and feel it's worth responding to. I don't often upvote my own comments, but will in this case for visibility.)
There is an aspect of this I haven't seen many people mention. The presence in the world of a lot more useful stuff ("wealth"), whether created by robots or people, doesn't make anyone less rich. It does, however, as you pointed out, make it harder for some people to trade what they have, whether goods or services. Supply and demand, and all that.

However, the reason I'm not at all scared of AI and automation is that, without "government" coercion, nothing would be stopping people from building their own homes and living off the land, and what they DO have to trade (mostly their own brain and muscle power) will buy a lot, when there is that much stuff in the world. Ironically, I think an "excess" of valuable stuff will push more people into being SELF-sufficient, which I think is a good thing.

On top of that, it makes actual charity--whether just one-on-one or via some big blockchain version which might look a little bit like UBI, minus the state violence--really damn easy. That has NOTHING to do with people being ENTITLED to anything, and that entitlement mentality ("I exist, therefore the world owes me stuff"), a.k.a. communism ("to each according to his need"), will not lead to good things. However, a voluntarily arranged system of giving ("I don't want anyone being homeless or starving, so I will give some of what is mine") is already taking care of a LOT of people. And with less coercion and more wealth, that just gets easier and easier. In the U.S., for example, starvation isn't a thing anymore. Having been "the poor" myself (thanks in large part to the ruling class) for years, I remind myself how ridiculously RICH "the poor" in the U.S. are.

Lastly, I would say that automating the hell out of production is still not going to "make all jobs go away"; it will just make them CHANGE a lot. As long as people still want something, anything, including non-physical things--which will be true forever--"jobs" still exist. They will just be easier, less dangerous, and less about making physical stuff.
Hey Larken! Thanks for replying.
I think we disagree here. We're already starting to see automation take jobs, and it will increase at an exponential rate. If 40-60% of the workforce will be replaced in the next handful of decades (see the research I linked to previously), and if any job they might consider retraining for is already being done via automation, how will they not become less rich? How will they earn money to eat and live?
Living off the land as a homesteader sounds great, and you describe it well in The Iron Web, but it's fiction. To think the millions of government "educated" people in cities today could just pack up their stuff, get an axe, travel to some unused land, cut down some trees, and build a home is not rational. It might be ideal, but it's not probable. The unused resources of the land cannot support that many people, and the people themselves are not skilled enough to survive. If the answer is "well, let them die then," I have to argue for something better.
Brain power will be primitive compared to commoditized AI. Muscle power only matters if it can provide real value. Digging ditches and filling them up again uses muscle power, but doesn't provide value.
I do think something like a UBI not backed by the violence of government could be part of the answer, and I'm glad to hear you concede that point. I hope you get a chance to read through Dan Larimer's posts on UBI which @kennyskitchen mentioned in his comment. I'm no fan of entitlement either, as I've mentioned before (How to Make an Entrepreneur Mad). I do see how a UBI system could increase entitlement in some people, but as I've mentioned, after reading articles shared in UBI Facebook groups for years now, I also recognize the opposite can happen. When people are freed from survival needs, they move up Maslow's Hierarchy and can provide much more valuable work and take risks which can benefit everyone.
I think this is where we miss each other. Who owns the emergent wealth created by the breakthroughs of humanity? Emergent properties are very real things, and they transcend one category as they move into the next. A few molecules of H2O can't be called wet, but at some point, when you add more, they can be. The wealth which exists and is created as an emergent property of all our activities is also real. Unfortunately, no individual can claim it, other than those who are already using the power of the state and fiat currency to siphon that value for themselves. What I'm advocating isn't redistribution so much as a more accurate distribution of this newly created value in the first place, much like STEEM is distributed to authors, commenters, and curators today so everyone can share in the emergent value of Steemit.com (already ranked 2,395 on Alexa).
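For anyone unfamiliar with how that distribution works, here's a toy version in Python. The real Steem reward algorithm is far more involved (vesting, reward curves, curation timing), so treat this as an illustration of the principle only; the 75/25 author/curator split and the numbers are just assumptions for the example.

```python
def distribute_reward_pool(pool, posts, author_share=0.75):
    """Toy proportional distribution: each post earns a slice of the pool
    proportional to the vote weight it attracted, then that slice is split
    between the author and the curators who voted. Illustrative only --
    not the actual Steem reward curve."""
    total_weight = sum(p["votes"] for p in posts)
    payouts = {}
    for p in posts:
        post_reward = pool * p["votes"] / total_weight
        payouts[p["author"]] = payouts.get(p["author"], 0.0) + post_reward * author_share
        curator_pot = post_reward * (1 - author_share)
        for curator, weight in p["curators"].items():
            payouts[curator] = payouts.get(curator, 0.0) + curator_pot * weight / p["votes"]
    return payouts

posts = [
    {"author": "alice", "votes": 300, "curators": {"bob": 200, "carol": 100}},
    {"author": "dan",   "votes": 100, "curators": {"alice": 100}},
]
print(distribute_reward_pool(1000.0, posts))
```

The point is that the newly created value is paid out to the people creating and curating it at the moment it is created, rather than being collected by someone and redistributed afterward.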
This contradicts the research I've seen from the experts studying this stuff. Yes, I get that experts can be wrong (the Malthusian Trap comes to mind), but I'm not convinced they are wrong in this case. People will want something, and the corporations and governments who commoditize AI and automation will be able to provide it far cheaper than humans can. In that scenario, how will the jobs you describe exist? Even non-physical things (music, art, writing, gaming) are already being created by AIs today. Soon they will be better than what we can produce because they will better understand the human mind and what it responds to.
I'm not one to be pessimistic (I'm usually accused of being overly optimistic), but I do think if we care about wellbeing we should consider this topic carefully. If we don't put something voluntary in place to deal with the potential future facing us, the government backed by violent force certainly will.
I do think there will be jobs with AI. They will simply shift into the creative and entertainment spaces: interacting on social media, making games for people, making movies, writing poems. I do think there will always be JOBS, though the kind of jobs you look at HELP WANTED ads to find will become harder and harder to find. It is likely to become more and more an environment where people have to CREATE their job. How that will actually manifest and what new problems will come with it is the great unknown. It will definitely bring its own share of new problems. New things typically do.

You are right that AI and automation are coming, and they are coming in a big way. Yet there are still many aspects of the human mind that we do not understand in terms of AI. We can make AIs that write posts and stories, yet if you see enough of them, they are still lacking something, and it's noticeable. We still have intuition, epiphanies, "wisdom", and many other strange things that our AIs do not simulate yet. It is likely that many jobs will be born from these elusive traits of our mind. The problem is that not everyone is good at those things. So what about those people?
It is a very interesting topic.
Thanks for the inspiration... well thanks to you and to Larken.
I was inspired to take a stab at being a futurist.
https://steemit.com/future/@dwinblood/the-future-an-optimistic-look
I'm not as confident we as human beings will be able to compete with AIs in the categories you mentioned (games, music, movies, poems, writing, etc.) for very long. From the research I've seen and the books I've read, these things are coming very, very quickly. Eventually, if we let AI learn exponentially as it may be capable of doing, it will know us better than we know ourselves.
Some smart people are very concerned about that, and they should be. I also think we have the potential for AI to teach us what morality really is (or should be). Also, I think a merging is coming between synthetic and biological "life".
I read a book, The Age of Em (about emulated minds), which I didn't really enjoy all that much, but it did explore some really interesting concepts, such as being able to make copies of yourself. What if I had thousands of copies of me responding to Steemit posts just exactly as "I" would? How weird will things get then?
I meant to reply the following to you... but replied to myself instead... so let's try again.
Thanks for the inspiration... well thanks to you and to Larken.
I was inspired to take a stab at being a futurist.
https://steemit.com/future/@dwinblood/the-future-an-optimistic-look
Loved it! Well done. :)
Very well written. I appreciate your balanced approach and the pleasant tone with which you disagree. Upvoted. Happy to follow ya.
Thank you. I have great respect for @larkenrose and think he is one of the best free thinkers of our time. One of the challenges I have (as funny as it may sound) is finding things I disagree with him on. It's quite easy to put people whose ideas we cherish up on a pedestal and begin to think they are infallible. This, I think, leads to destructive tribalism as we all follow the leader/ruler and stop thinking for ourselves. I actively seek out things to disagree with Larken on just so I can keep myself from falling into this trap.
I liked your article and completely agree. I also read lukestrokes' long response trying to make us accept that UBI can be good. I'm not buying it. Whenever "we" have tried to experiment with giving groups of people free stuff, it has always made that group dependent on more free stuff and set them back generations. I've read quite a bit about these types of social programs that were put into place in Detroit, and look what happened... I'm not saying those who are crippled, handicapped, mentally challenged, or the like should have to contribute or die. I'm saying that flipping the script from hard work and earning to no need for either will degenerate our society. It's a disaster waiting to happen.
I'm an engineer. In my position I spend much of my time assessing technology, looking at its trends, and figuring out how and when our company can benefit from it. History is filled with projections of the devastation that would be caused by the advent of the latest technologies. I could list dozens of examples, but I don't think it's necessary. In the end, we adapt. Technology will always replace old ways of doing things with new ways that are faster, more efficient, and/or cheaper. People will adapt. Unless, of course, we dumb them down, give them a bunch of free stuff, and tell them we will take care of them...
I think you may be comparing apples with oranges. From what I've seen, actual UBI experiments (i.e. not welfare systems based on eligibility requirements, but resources given universally to a group of people) did not leave people more dependent on free stuff. On the contrary, when they no longer had to worry about basic survival, they ended up doing even more valuable, risky, and creative work. I know, it's surprising and counter-intuitive, but that's what the data from the pilot programs I've seen so far shows. Just look at all the work people do for free within open source communities as an example. If people's needs are met, they can do more creative work, and behavioral psychology shows us that intrinsic motivation creates more creative output than extrinsic motivation. The data from the pilot programs which have already been run is out there, if you want to engage with it. I thought it was BS too, which is why I joined some Facebook groups a while back to challenge my own perception.
I'm liking this reply a lot.
Blockchains change the game of economics, so thinking about economics in the "old ways" isn't accurate anymore. Governments have less power over money than they used to, so new ideas, such as some form of blockchain communism, might actually make sense and actually work.
I dunno if it would, but eh, if the code is strong enough, it might.
Communism may not be the right word for it (too much historical baggage), but yes, I think the very nature of value changes as technology advances and more people move up Maslow's Hierarchy of needs because of it.
Aye, I felt weird using communism too. It does have a lot of baggage. Personally, I was hoping no one would think I am a communist, and then make fun of me for it or something. -_-
That's just how it is. Communists just don't have a good standing, even if a "commucoin" that actually worked could exist.
The real point is that the game has changed, and it's awesome to finally have a lot more options.