Over the last few years, there have been a few small headlines questioning if value investing is still viable…
Now The Economist has published a new article on value investing that has caught Dan’s eye. Dan digs into some popular value metrics and the future of the value investor in his opening rant.
Then Dan sits down for a conversation with Annie Duke. Annie is a former professional poker player who won over $4 million in poker before retiring in 2012. Today she is an author, corporate speaker, and consultant in the decision-making space.
Annie and Dan talk at great length about her new book, How to Decide, and the big ideas behind her bestseller, Thinking in Bets: Making Smarter Decisions When You Don’t Have All the Facts. Annie provides the listeners with a framework for making better decisions and producing more consistent investment results.
Annie says by shifting your thinking from a need for certainty to a goal of accurately assessing what you know and what you don’t, you’ll be less vulnerable to reactive emotions, knee-jerk biases, and destructive habits in your decision making… something every investor needs to master.
And finally, on this week’s mailbag, Dan answers questions about DRIP investing and entering the crypto space for the first time…
But it turns out tensions are still high from last week’s episode. Today, two different listeners write in accusing Dan of being unpatriotic.
Listen to Dan’s full rebuttal and more on this week’s episode. And check out our other podcast: American Consequences with Trish Regan: https://americanconsequences.com/podcast/
Author, National Poker Champion
Annie Duke is a World Series of Poker bracelet winner, the winner of the 2004 Tournament of Champions and the only woman to win the NBC National Poker Heads Up Championship. She has authored four books on poker and in 2018 released her first book for general audiences called "Thinking in Bets: Making Smarter Decisions When You Don't Have All the Facts," which is a national bestseller.
1:40 – Over the last few years, there have been a few small headlines questioning if value investing is still viable… Now The Economist has published a new article questioning if value investing is still worth practicing…
6:30 – The headline grabs readers, but inside the article… “They point out value investing’s rigor and skepticism are as relevant as ever especially given how frothy markets look…”
14:24 – Dan shares some troubling research from Hoisington Investment Management Company’s quarterly review. “These guys have been spot-on, calling it right when a lot of other people were getting it wrong…”
21:47 – Dan shares his new quote of the week, from Naval Ravikant on The Joe Rogan Experience episode #1309… “The human brain is not designed to absorb all of the world’s breaking news 24/7…”
24:20 – This week, Dan invites Annie Duke onto the show for an interview. Annie is a former professional poker player who won over $4 million in poker before retiring in 2012. Today she is an author, corporate speaker, and consultant in the decision-making space.
27:12 – Dan asks Annie about how to internalize some of the big ideas from her new book, How to Decide.
34:27 – “Colloquially when we say mistake, what we mean is we got an outcome we didn’t like, as opposed to the decision making was poor.”
38:20 – Annie on assessing past decisions… “What I really care about when assessing a decision is, what was my state of knowledge at the time when I made the decision, cause that’s actually gonna help me figure out whether the decision was good or not… because I can start to see ‘were there gaps in my knowledge where I actually should have known that?'”
42:19 – Annie says it’s incredibly important that, before you invest, you be explicit about the knowledge you’re basing your decision on.
48:55 – Annie advocates making a detailed record of all your decisions, because it’s hard to honestly assess them in retrospect. “If you haven’t been recording your decisions in any kind of way… it becomes a tough problem to try to figure this stuff out.”
54:30 – Dan points out that most people aren’t willing to put in all the work that Annie is talking about because the payoff shows up later. “Most people they make their decisions quickly, and with less forethought than they want to…”
1:03:30 – Annie leaves the listeners with one final thought before the interview closes: “Your decisions are only ever going to be as good as your beliefs, because your beliefs inform all the decisions that you’ll make… it’s really good to get the perspectives of other people.”
1:07:31 – On the mailbag this week, one listener calls in and asks questions about Dividend Reinvestment Programs. Another listener asks about how to best get started with cryptocurrency… But the big story is two listeners call in and accuse Dan of not being patriotic enough. Listen to Dan’s response on this week’s episode.
Announcer: Broadcasting from the Investor Hour Studios and all around the world, you're listening to the Stansberry Investor Hour. [Music plays] Tune in each Thursday in iTunes, Google Play and everywhere you find podcasts for the latest episodes of the Stansberry Investor Hour. Sign up for the free show archive at investorhour.com. Here's your host, Dan Ferris.
Dan Ferris: Hello and welcome to the Stansberry Investor Hour. I'm your host, Dan Ferris. I’m also the editor of Extreme Value published by Stansberry Research. Before we get into today's episode, don't forget, Trish Regan is now a part of the Stansberry family. Check out her podcast, American Consequences with Trish Regan. The link will be in the description of this episode. As for today, we'll talk with Annie Duke, the champion poker player and author. She has a brand-new book out called, How to Decide. We'll talk about that today. This week in the mailbag, not one but two listeners suggest I'm not patriotic enough. Plus, questions about DRIPs and bitcoin.
In my opening rant this week, The Economist signals on value investing. It's the Economist signal, man. We'll get into that, and I'll also talk about some other research I've looked at this week. That and more right now on the Stansberry Investor Hour. "Does value investing still work?" The Economist asks in their November 14 edition. And the reason I noticed that – I've seen so many of these articles about value investing, probably a couple dozen over the last couple of years. So they're becoming quite commonplace because value has performed poorly over the last decade as I keep reminding you, as people keep reminding me. But this time, value is – it's not on the cover of The Economist.
But it's one of the top articles in the latest issue of The Economist. And value investing is so unimportant to so many people that it's never going to make it on the cover. So this is going to be as close as we're going to get. And just to give you a little background, The Economist is one of those publications that has a reputation for the... especially the cover stories, being good as contrarian indicators. And so, there's an article actually from 2016, two analysts from Citigroup did some research, and they looked at covers of The Economist.
And The Economist wrote an article about this. So The Economist is writing about itself as a contrarian indicator. And these two analysts from Citi, they looked at 44 cover images between 1998 and 2016. The one that you probably know the best was the drowning in oil cover from February 1999, which preceded a pretty healthy... a huge commodity bull market and, of course, an oil bull market. There was also November 2009, "Brazil takes off," it said on the cover about the Brazilian stock market. So The Economist points out... you know, the story wasn't even about the Brazilian stock market at all.
So, you take this stuff with a grain of salt, according to them. And indeed, the analysis that the Citi guys did found that after 180 days, only about 53.3% of The Economist covers qualified as good contrarian indicators. It's like tossing a coin, right? 50%. But after 360 days, after a whole year, the signal is a lot more reliable. 68.2% are contrarian. And it says, "Buying the asset if the cover is very bearish results in an 18% return over the following year – shorting the asset when the cover is bullish generates a return of 7.5%." It sounds like you could have like an Economist contrarian fund that just, you know, goes long and short the cover of The Economist accordingly.
So that's why I'm looking at this article called, "Does value investing still work," and saying, "OK, value is about to start working again." And they point to two ideas. Two changes, they say, that have influenced the past decade and may be the reasons why value doesn't work anymore. And the first they said is the rise of intangible assets, which now account for one-third of all American business investment. And I've talked about this before. All that means, really, is perhaps that book value isn't a great measure of a business's real intrinsic value, right?
Because a lot of this stuff gets expensed. You know? The accounting rules are such that lots of investments that really would qualify as capital investments in research and development get expensed. So they don't wind up on the balance sheet. They don't wind up in the book value, and they're not measured in price-to-book value. So I buy that. I buy that book value maybe isn't working so great anymore. But I don't buy that just because lots of firms are buying intangible assets, worrying about how much you pay for the cash flows you're getting doesn't matter anymore. You know? It just doesn't make any sense.
And the second change they say happened in the past decade or two or whatever... the second change they say is the rising importance of externalities. Costs that firms are responsible for but avoid paying. And they say, for example... like, today the value doctrine suggests you should load up on car firms and oil producers, automobile-producing companies and oil-producing companies. But hey, they say, these firms' prospects depend on the potential liability from their carbon footprint... the cost of which may rise as emissions rules tighten and carbon taxes spread. I think I'm less worried about carbon footprint in this respect than most people. I think that plenty of really fine, profitable automobile producers will just change with the times.
And if they have to make electric cars, they'll make electric cars. And as far as oil production goes, I think we're going to be needing oil longer than maybe these folks at The Economist think. Let's just leave it at that. OK? So, they do point out that value investing's rigor and skepticism are as relevant as ever – especially given how frothy markets look. OK? So if you read the article, they're reasonable about it. But just the fact that you can attract readers with a headline, "Does value investing still work" – that's really the contrarian indicator right there. You almost don't even need the article.
So let me go from there to some research I was reading this week which falls very hard upon the conclusion of The Economist: that the rigor of value investing is important. And it's John Hussman's latest market comment. You know? We always check in with John Hussman. And I wish he would come on the program and do an interview with us. I've invited him on via Twitter many times, but I just don't think he's interested. Some people are like that. I respect it. It's fine. But we're going to keep citing his research because I think it's the best work on the valuation of the overall stock market.
And he's got a great picture, a great graphic image, at the top of his latest market comment. It's a volume control like from an electric guitar amplifier or something. And it goes up to 11. And it's turned up to 11, right? That idea of turning it up to 11 is from a parody film called This Is Spinal Tap. It's a parody of an English rock band. And they painted the number 11 on their amplifiers and said, "It just goes to 11." And it's a mock documentary. So the interviewer in the documentary keeps saying, "But that's just the highest setting, and you just painted that on there." The guy said, "No. No. It just goes to 11."
So this market goes to 11, and it's on 11, right? Hussman tracks these five measures which are all kind of price-to-sales, price-to-earnings oriented. And they're higher than they've ever been – including the peak of the dot-com bubble in 2000 and the peak of the 1929 market. More expensive than ever. The market is more expensive than ever by those five measures. And there's another one I can cite. I think there are six or seven of them in total. And the other one I can cite is the Graham and Buffett indicator. The total market-cap of all the stocks compared to the GDP of the United States. And that is closing in on 170%. It's like 166% or 167% the last time I saw it just days ago.
So, you know, it's more expensive than ever. It's never been that high before – including the dot com and '29 peaks. There are at least six measures that I know of that say, "This is the most expensive market any way you look at it." But Hussman has a terrific quote that really boils down – for me, it boils down what value investing is all about. And it's by Jeremy Grantham from a CNBC interview on November 12. And Grantham from GMO – these people studied bubbles at GMO. They manage money. We've had James Montier from GMO on the program. I'd love to have Jeremy Grantham on sometime, if he'd do it.
But Grantham says, "This is the third real McCoy bubble of recent decades." That's what we're in right now, November 2020... the third real McCoy asset bubble of recent decades. And on CNBC, he said, quote, "The one reality that you can never change is that a higher-priced asset will produce a lower return than a lower-priced asset. You can't have your cake and eat it. You can enjoy it now, or you can enjoy it steadily in the distant future. But not both. And the price we pay for having this market go higher and higher is a lower 10-year return from the peak." End quote.
And Hussman has done the exact same thing. He's said the exact same thing different ways with different data. You know, he has an estimated 12-year annual total return that he does on what he calls a conventional 60-30-10 mix. 60% S&P 500 stocks, 30% Treasury bonds, 10% T-bills, the short bonds. And the estimated total return based on the current valuation – right, the current price versus the cash flows that you're going to get out of those three assets – the 12-year return is like 1.56%. Negative. I'm sorry, negative 1.56%. It's negative. And that always happens with these estimates.
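The blended figure Dan cites is just a weighted average of per-asset return estimates. Here's a minimal sketch in Python: the 60/30/10 weights come from the discussion, but the individual return estimates below are hypothetical placeholders, not Hussman's actual component figures.

```python
# Weighted-average expected return for a conventional 60/30/10 mix.
# Only the weights come from the discussion above; the per-asset
# return estimates are HYPOTHETICAL placeholders for illustration.

weights = {"S&P 500": 0.60, "Treasury bonds": 0.30, "T-bills": 0.10}
est_annual_return = {          # hypothetical 12-year annualized estimates
    "S&P 500": -0.035,
    "Treasury bonds": 0.015,
    "T-bills": 0.005,
}

# Blended return = sum of weight * estimated return for each asset.
blended = sum(weights[a] * est_annual_return[a] for a in weights)
print(f"Estimated 12-year annual total return: {blended:.2%}")
```

With these placeholder inputs the blend works out to roughly negative 1.6% a year, which is the shape of the point: when the dominant asset in the mix is priced for low or negative returns, the whole portfolio's estimate goes negative.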
And, you know, even GMO, they do these seven-year estimates. And they're estimating really low returns and negative returns in some stocks. It's basic arithmetic, right? You pay $10 for an asset, you get $1 a year in cash flows. That's a 10% return. You know? You pay $100 for the same thing. It's the same cash flow. But now, it's a 1% return. So you must be careful. You must be careful about how much you pay. Now look. I know that the cash flows from stocks are not something that you can predict so great – especially those longer-term, farther out ones.
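That price-sets-your-yield arithmetic can be written down in a couple of lines. A sketch in Python, using only the hypothetical $10/$100 prices and $1 cash flow from the example above:

```python
# The same $1 of annual cash flow at two different purchase prices.

def cash_yield(price: float, annual_cash_flow: float) -> float:
    """Annual cash flow as a fraction of the price paid."""
    return annual_cash_flow / price

print(f"Pay $10 for $1/year:  {cash_yield(10, 1):.0%}")   # 10% return
print(f"Pay $100 for $1/year: {cash_yield(100, 1):.0%}")  # 1% return
```

Same asset, same cash flow; only the price paid changed, and the return fell by a factor of ten.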
And you can buy a huge basket of stocks and a small core of them are probably going to grow like crazy over the next 10, 20 years and produce, you know, most of your return. And you wouldn't have been able to predict like which 50 out of 500 or 20 or 30 out of 500 were going to go up 50 or 100 times and make you all your money. So I understand that you can't – you know, it's not as mechanical as it is with bonds, right? You buy a bond, they say, "We're going to pay you this much in interest every year, and then you're going to get your principal back in this year."
So you know exactly what you're getting. You know exactly what kind of yield you're getting, depending on whatever price you're paying at this moment. Whereas, you know, with stocks it's not as easy to figure that out. In fact, you can't truly figure it out. Can you? Because you can't know the future. However, when stocks are more expensive than they have ever been in history, you've got to kind of scratch your head and go, "Hmm. Maybe this is riskier and lower return than it's ever been in history. You know? By all the measures that make any sense." Price-to-sales, for example, is one of Hussman's five metrics.
And it's an easy one. The other ones that he uses are more complex. But price-to-sales is easy. And at 2.6 times sales, the S&P 500 is more expensive than it has ever been. I think the peak in 2000 was – March of 2000, dot-com peak – was like 2.3 or 2.4. So we're above that now. And doesn't mean the market's going to crash tomorrow, but it just means it represents a lot more risk than it did at lower prices, and it represents a low return. That's all I’m saying here. And I'm saying it over and over again because it keeps being like it is right now... the worst ever. The highest ever. The most expensive ever. I can't turn away from that. Most of the time, I don't even care about this stuff, but I really cannot turn away from it. OK?
The other thing I was reading this week was Hoisington Investment Management Company. They're based in Texas, and the guys there – Van Hoisington and Lacy Hunt – are just really two of the smarter investors in the market over the past 40 years. And they recently put out their third quarter 2020 quarterly review and outlook. They put out one of these every quarter. I highly recommend going to hoisingtonmgt.com and checking these out. Because these guys have been spot-on, calling it right when a lot of other people were getting it wrong – especially like the past 10 years.
The typical wrong call in '09, 2010 – you know, before gold peaked and crashed – was, "Oh, we're going to have terrible inflation because of all the quantitative easing after the financial crisis." But the Hoisington folks, Lacy Hunt especially, came out and said, "No. That's not how this works." We've covered this as we've interviewed some of our macro investor guests, and it's a simple proposition. When you do quantitative easing, you print money but you buy bonds, right? So you take bonds out of the market and replace them with something that doesn't yield anything... just $1 of bank reserves is all it is, right?
And that's deflationary. But these guys, the Hoisington guys, they've been all over this for decades. And in their latest quarterly review – it's the 40th anniversary of their firm, so they look back 40 years. And their current analytical review contains five points that I think are important. So let me just read them to you now.
2. The U.S. is caught in a deathtrap... a condition where too much debt weakens growth, which elicits a policy response that creates more debt that results in even more disappointing business conditions.
Numbers two and three are two ways of saying the more and more debt we pile on right now, the worse we're making things and the slower the economy will grow.
And the reason I think you can still hold something like gold and silver, even though I just read all this stuff that has disinflation and economic deterioration written all over it, is the response. The response at some point from the Federal Reserve and all central banks around the world will get to be too much. They'll print too much money, and they'll figure out a way to overcome that hurdle that Hoisington has been talking about, where the money just sits there in reserves, and it doesn't get lent, and it doesn't get spent. They'll figure out how to get around that. They'll print too much, and it will push up asset prices, and it will push up wages and all the other things that tell you inflation is really happening... eventually.
And believe me. When it really starts happening, you won't want to start buying gold then. You want to have it before then. You'll want to have Bitcoin before then. So I think this is really important, and I listen to these guys because I think they've proven that they are in tune with the historical trend. And you could say, "Well, Dan, you're telling me you're going to make the same mistake." Right? "You're going to make the same mistake that they made 10 years ago. You're going to predict inflation even though these guys have predicted disinflation."
What I'm telling you is that we're way farther – having printed $3 trillion new dollars, right... almost as much in all the years of quantitative easing after the financial crisis, just in one year we printed over $3 trillion new dollars. We're much farther down that road. We're effectively more than just 10 years down that road from 2010. Or you could cite it from the bottom of the crisis in 2009. 11 years. We're effectively more than 11 years down that road. We're effectively like 20 years down that road. Time is a different thing with investments. It's an accordion. Time becomes an accordion. It stretches out and compacts into a smaller space. We compacted a lot of time into a very small space this year. You should think of it that way.
And there's a really great book by Benoît Mandelbrot called The Misbehavior of Markets that contains this discussion about time. So yeah. I was reading Hoisington. I was reading Hussman this week. And I'll finish up on a positive note for value investing again, how I started out, by pointing you to mindsetcapital.com. That's Aaron Edelheit's firm and his website. And he's got a new presentation there. It's called, "Value Investing is Alive and Well," because he's a value investor. He's finding all kinds of good deals.
And he has stock ideas in that presentation. And you might want to go have a look at it and maybe hear about value investing from a real practitioner who's done very well with it over time. He's doing it right now and getting good returns and finding good deals. And, you know, he'll include some of those in the presentation for you too. So I think that's all I'm going to say about this, right? Value is still getting a bad rap even though I think it's really... as time goes by, it becomes more and more important to allocate to value. And the market is extremely overvalued.
So do be mindful of value when you – especially when you buy equities nowadays. Actually, and bonds too. Stocks and bonds. Very, very expensive. And read that Hoisington piece and see if you come away with a similar view to what I've expressed. And now what I'd like to do, my quote of the week, the second week. OK? We started this out last week with a quote from my old friend Chris Mayer. And I'm going to do one this week from a guy named Naval Ravikant. He is getting to be a more well-known name in the world. He's an entrepreneur. And he's also just a very wise, insightful fellow. He does a lot of reading. He comments on a lot of different trends and areas.
And he's a very – he's mostly a scientifically oriented guy. But he comments a lot on the culture that we live in too. And the quote comes from The Joe Rogan Experience podcast number 1309 on June 4th, 2019. And they got to talking about social media and just the way that we are bombarded with information nowadays. And I thought Ravikant just was brilliant on this topic. So I'll read just a little bit of what he said here. Quote, "The human brain is not designed to absorb all of the world's breaking news 24/7. With emergencies injected straight into your skull with click-bait headlines and news.
If you pay attention to that stuff, even if you're well-meaning, even if you are sound of mind and body, it will eventually drive you insane. Twitter, Facebook, Instagram, these are weaponized. You have social statisticians and scientists and researchers and people in lab coats, literally the best minds of our generation, figuring out how to addict you to the news. And if you fall for it, if you get addicted, your brain will get destroyed." End quote. I found that incredibly powerful simply because he was willing to not mince words. "Your brain will get destroyed." He's not on the fence here. He's very clear about what he thinks will happen.
He also commented on politics in a similar way and used similar words. He said, "You know, if you're always paying attention to politics, it'll destroy your ability to think." I think that's brilliant. I'm going to let it stand on its own. Maybe send us a little feedback, see what you think if you have a similar or different view at [email protected] For now, let's talk with Annie Duke. Let's do that right now.
[Music plays and stops]
My listeners, now, they need to secure their savings for the future. And after working with my friend and publisher, Porter Stansberry, for nearly two decades, I've seen him make one incredible investment call after another over the years.
So that's why I wanted to recommend Porter's one critical move you must make with your money. You can get his full take on the subject by visiting www.newamericancurrency.com. Don't miss out.
[Music plays and stops]
Today's guest is Annie Duke. Annie Duke is an author, corporate speaker and consultant in the decision-making space. Annie's book, Thinking in Bets: Making Smarter Decisions When You Don't Have All the Facts, is a national best-seller. As a former professional poker player, Annie won more than $4 million in tournament poker before retiring from the game in 2012. Nice.
Prior to becoming a professional player, Annie was awarded a National Science Foundation Fellowship to study cognitive psychology at the University of Pennsylvania. Annie is the co-founder of the Alliance for Decision Education, a nonprofit whose mission is to improve lives by empowering students through decision skills education. She's also a member of the National Board of After-School All-Stars and the board of directors of the Franklin Institute. In 2020, she joined the board of the Renew Democracy Initiative. Annie Duke, welcome back to the program.
Annie Duke: Thanks for having me back.
Dan Ferris: Oh, you bet. I wouldn't have missed the chance. I mean, a new book from Annie Duke is like, "Wow. This is an event."
Annie Duke: Well, it was an event for me anyway.
Dan Ferris: Yeah. That's right. And for me too. So, you know, I said, "Wow. I can't wait to get this book." And I like pre-ordered or whatever. And it finally showed up. And I was like, "Ah, I’m going to tear into this." It's only like 200 – it's not even 300 pages... like 250 pages or something. And then, I get into it, Annie, and you're giving me homework on every other page.
Annie Duke: So yeah. There's an interactive element to the book. And you can engage with those elements kind of as much or as little as you want to. So you can just read it and kind of look at what the exercises are trying to get you to think about, or you could actually engage with the exercises. I mean, obviously I feel like it's better if you actually engage with them. But you can get a lot out of the book without that because it's really going through the concepts. But the idea is that there are certain things that I think are really helpful to kind of understand in terms of the way that your decision-making can go wrong... which is some of what those thought experiments are trying to get at. And then, I'm offering really concrete tools for how to make all of this stuff better. So you could either choose to do that while you're reading the book, or you can just know that those tools are available to you, that will actually help improve your decision-making.
Dan Ferris: Yeah. I sounded like I was complaining, but I'm really – I was just joking. But it seemed to me as I look through... I'm biased here because I read a lot of books about learning. I read Josh Waitzkin's The Art of Learning, and Benedict Carey's How We Learn, and a couple others. And it seemed to me like between your last book and this one – you're still writing about decision-making, but you're also wanting to... you want your reader to learn it better. You know? It's not a – it seemed like... I got the impression it's just not enough to read about it. You must think and write about it. When you write about things, you learn them better. Don't you?
Annie Duke: Yeah. I think that's actually one of the reasons why I wrote the book. So Thinking in Bets was really trying to get people to start thinking about the problems of uncertainty and how that can, you know, mess with your decision-making. And by uncertainty, I mean two specific things... that luck has a really big influence on the way that your life turns out but also that, you know, when we're making decisions we don't have a ton of information... certainly not in relation to the, you know – if we were omniscient, what could be known versus what we do know when we make decisions. You know, it's just limited.
And I think that, you know, nothing more – coronavirus certainly shows us that, right? Like, we're trying to make really serious decisions about managing a risk, and we know very, very little about the virus. So, you know, I think that that really brings that issue to the fore. So I was trying to sort of get that point across in Thinking in Bets. And, you know, what a lot of people asked for was, "OK. So given that there's luck and we don't know very much – there's a lot of hidden information – how would I actually make really good decisions given that? How would I actually think about how to become a better learner from experience? How do I think about how to construct better forecasts of the future?"
Which is really kind of what a decision is, a prediction of the future. "How can I figure out when I can go fast or slow or how I can better extract information from the world?" And I think these are all very practical questions – not ones that were covered very well in a lot of the literature. And I wanted to write a book that was really practical... where someone could read it and feel like they came out the other side actually knowing that they can be a better decision-maker.
Dan Ferris: Yeah. Feeling it. Knowing it with some conviction. I agree. That's really cool. So the topics are similar. Like, we – you know, you talk for example about resulting, right? This idea that looking at the result of your decision is how you determine if it's a good or bad decision. That's not really – that's not really true in real life, right? And other similar topics that you discuss in Thinking in Bets. Is there anything really brand-new here? Like, what's the state-of-the-art in decision-making?
Annie Duke: So actually, the majority of the book is pretty brand new relative to Thinking in Bets. So I start in the same place because you kind of have to... which is this issue of we learn from experience, obviously. Like, how else would we learn? You do stuff, and then you find out how it works out. And then, you sort of decide whether you want to do that again or do something different or... that's kind of how we become – we should be becoming better decision-makers.
But because of things like resulting, which you just talked about, there's all sorts of ways in which that doesn’t actually proceed in a very orderly fashion. We're not really good at extracting lessons from the outcomes in our lives. And that's, to your point, what you said. It's because they're not connected very well together. Because you can go through a red light, and you cannot get in an accident, and you can go through a green light and you can get into one. So it's hard to know, like, if it went well or not, exactly why that is. So that's where the book starts, because it kind of has to.
And it's grounded in some exercises. But then, it takes a pretty big, sharp left, and it starts talking about, "How do you kind of build out very, you know, relatively easy-to-understand decision trees? How do you actually start thinking about probability? You know, what things are likely to happen if I make a particular decision? How likely are those things to happen? You know, what's the pay-off for any decision that I make?" Which is really the way that you would actually walk through making a really good decision.
And part of that, you have to start thinking about your past experience as, "What if something else had occurred?" Right? So if I go through a green light and I get in an accident, what if I hadn't gotten in an accident? Like, how would that have happened?" So that's sort of the center of the book, is really talking about, "How do you actually think about new decisions and constructing these decision trees?" Then it gets into some stuff about like, "How do you actually figure out what knowledge is really important in order to inform your decisions better?"
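The decision-tree framing Annie describes (list the possible outcomes, attach a probability and a payoff to each, and compare decisions by expected value) can be sketched in a few lines of Python. All of the numbers below are hypothetical, invented purely for illustration; they are not from the book:

```python
# Sketch of a simple decision tree: each decision is a list of
# (probability, payoff) branches, and we compare expected values.

def expected_value(outcomes):
    """outcomes: list of (probability, payoff) tuples for one decision."""
    total_p = sum(p for p, _ in outcomes)
    assert abs(total_p - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(p * payoff for p, payoff in outcomes)

# Hypothetical choice: invest $10,000 in a stock vs. hold cash.
invest = [
    (0.30, 4000),   # thesis plays out well: +$4,000
    (0.50, 500),    # modest gain: +$500
    (0.20, -3000),  # thesis wrong: -$3,000
]
hold_cash = [(1.0, 100)]  # roughly certain small interest payment

print("EV(invest):", expected_value(invest))     # 1200 + 250 - 600 = 850.0
print("EV(hold):", expected_value(hold_cash))    # 100.0
```

The point is not the specific numbers but the discipline: writing the branches down forces you to state explicitly which outcomes you think are possible and how likely each one is.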
So it gets into some things about inside and outside view. Inside view is like the way that we think about the world through our own perspective... which is generally where all the cognitive bias lives. And then, the outside view is kind of what's true in the world in general or how other people would view the situation. Then it actually moves into how to speed your decision-making up. How to actually be a faster decision-maker... which is obviously very different than anything that was in the previous book. And then, the rest of it is the power of negative thinking. And then finally, how to actually start interacting with other people so that you can create really good team decision-making... which is actually quite a big problem.
Dan Ferris: Right. I'm actually really glad that you started out with resulting because it's a great topic. Isn't it? It's completely counterintuitive to look at a good outcome of a decision and call it a bad decision. And as investors, which is mostly what our audience is, they want to focus almost completely on their results. It's just so hard not to. And you can see it in the marketing for hedge funds and mutual funds and other products where they tell you, "Hey. Over the past three years, we've beat every index," or something. And of course, that's usually an indication they're going to underperform in the next three years.
Annie Duke: Yeah. Yeah.
Dan Ferris: So where – the real question then becomes, "What's the role for assessing outcomes?" Like, that seems to be like the really – there's two hard parts. "How do we assess the outcome, and then how do we know we made a good or bad decision if the immediate outcome, you know, on one or two occasions is not of primary importance?" This is a big deal, I think.
Annie Duke: It is a big deal. So let me think about it as a retrospective problem first... meaning you made a decision, you already got an outcome. "How do you figure it out?" And then, I think it'll roll back into what you need to be doing as you're making new decisions. So if we think about the retrospective problem, outcomes in the aggregate – like, if we have enough data – are quite informative. So if I can flip a coin 1,000 times, I can tell you lots and lots about the coin. For example, whether it's fair.
But the problem in terms of the way that our mind works is that we don't really wait for decisions in the aggregate to work out. Like, we don't wait to have a very big data, you know... huge data set before we make any decisions. And we tend to be processing those outcomes in sequence... in other words, one at a time. And that's where we can really get into really big difficulties. And there's certain types of investing that you can do, as you know, where it's hard to wait for a lot of data in the aggregate because you may not actually be investing in too many things, and they may have very long feedback loops. So this sort of becomes a problem if we're going to – and the other thing that becomes a problem because of the resulting thing is that colloquially when we say mistake, what we mean is we got an outcome we didn't like.
Dan Ferris: Yeah.
Annie Duke: As opposed to, "The decision was poor." Right? So if you invest in a stock and it goes down, you'll tend to say, "I made a mistake." Which of course is – it shouldn't even be a sentence of English because I have no idea if you made a mistake just because it went down, right? There's a whole bunch of other things that I would need to know. So essentially, the way to address it is to basically try to do two things... one is reconstruct your state of knowledge at the time of the decision. And then, the other is reconstruct what, like, a decision tree would've generally looked like.
So let's take the first one. So I've got this tool in the book which is called, "A knowledge tracker." So basically what you would say is, "What did I know at the time of the decision?" So there's certain things that you can't know. So one of the things that becomes really obvious when you think about, "What did I know at the time of the decision," is that you did not know the way it would turn out. So just right there, that's actually really helpful to say, "Let me write down, "What are the things that I knew at the time of the decision?"
Because what is not contained in there... that the stock went up or down, for example. So now, at least you can get some separation from that. But it's going to be inputs. It's going to be like, you know – could be, "What are the unit economics," or, "You know, what are the earnings projections? What do I think" – and then, you can sort of look at... so you can look at, "What is the data? What are the things that I knew that went into the decision that were facts, and then what was I thinking – what were my beliefs about those facts?" Right?
So it could be that the company is predicting they're going to make 100 widgets over the next year, but I actually think that that's an underestimate, and I think they're going to be able to produce 120 widgets over the next year. And I think I know something different than what the market does. So you're basically just thinking about, "What were the inputs into the decision?" Then you look at, "What are the facts that revealed themselves afterwards?"
So there'll be certain facts like you'll know how many widgets it actually produced, as an example, right? You'll know something about whether it went up or down. You may know – there may be an acquisition that occurred or whatever it might be. So you say, "OK. So here are the things that I knew before-the-fact. Here are the things that I knew after-the-fact. And now, let me look at things that I knew after-the-fact and ask myself, "Were any of these things actually knowable to me beforehand?" For most of the stuff that reveals itself after-the-fact, the answer to that is going to be no.
For the things that you say "yes," you would say, "OK. It was knowable to me beforehand, but could I afford to find it out?" In other words, like, "Did I have enough time to go find it out? Was it reasonable for me to understand that this was actually an important thing for me to know at the time?" Because a lot of times, the answer to that is going to be "no" – that that revealed itself to you later. It could be that it cost too much money to get the information. There could be a variety of things. But let's say that you said, "Yes. I could've known about it before-the-fact, and I didn't go find it out."
Then it's not about beating yourself up. It's saying, "OK. Let me make sure now that I wrap that into any future decision that I might make." If something revealed itself that was knowable that you didn't realize that you needed to know but now you do, you would wrap that into your decision-making going forward. And then for the whole set of things that weren't knowable beforehand, you can – generally they're not going to be knowable for the next decision that you make. So you would recognize that, and you would just say, "I'm not going to worry too much about that category."
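One way to make this "knowledge tracker" concrete is a record you fill in at decision time and revisit after the outcome. The structure and field names below are my own sketch of the idea, not a template from the book:

```python
# A minimal decision record: what you knew and believed going in, and
# what revealed itself afterwards, tagged by whether it was knowable
# and affordable to learn at the time.
from dataclasses import dataclass, field

@dataclass
class KnowledgeTracker:
    decision: str
    known_facts: list = field(default_factory=list)      # facts in hand at decision time
    beliefs: list = field(default_factory=list)          # your interpretations of those facts
    revealed_later: list = field(default_factory=list)   # (fact, was_knowable, was_affordable)

    def lessons(self):
        """Only facts that were knowable AND affordable to learn become
        lessons to wrap into future decisions; the rest are set aside."""
        return [fact for fact, knowable, affordable in self.revealed_later
                if knowable and affordable]

# Hypothetical example (WidgetCo is invented for illustration):
t = KnowledgeTracker("Buy WidgetCo")
t.known_facts.append("company guides 100 widgets next year")
t.beliefs.append("I think guidance is low; 120 is achievable")
t.revealed_later.append(("competitor launched rival product", True, True))
t.revealed_later.append(("CEO resigned unexpectedly", False, False))
print(t.lessons())  # ['competitor launched rival product']
```

The act of separating the knowable-and-affordable facts from the rest is what keeps you from either beating yourself up over the unknowable or ignoring the genuinely missable.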
So basically, what this allows you to do is, through this kind of knowledge-tracking, is to say, "What I really care about when I'm assessing a decision is, 'What was my state of knowledge at the time that I made the decision?'" Because that's going to actually help me figure out whether the decision was good or not. Because I can start to see like, "Were there gaps in my knowledge where I actually should've known that?" And then, you can do a second thing. And you can say – you can say, "All right.
So given what I knew at the time," you can sort of reconstruct a tree. And you can say, "What did I think the chances of different outcomes were?" And then you can look and say, "Was I a pretty good predictor of what that set was?" Now, the thing that I'll say about this is that you can go and reconstruct all of that if you want to. But obviously, the better thing to do is to do that work before you make the investment in the first place and actually record, "What are the facts that are going into my decision? What are the assumptions that I’m making about it? How am I modeling the data that I'm looking at? What does that mean for what I predict the probability of different outcomes occurring is?"
And if I believe that I know something more than the market does here, let me actually be very specific and not leave that as implicit and say, "I'm going to explicitly state, 'Why do I think that the market has this underpriced or overpriced?'" Because that's why you would invest. You either think it's underpriced or overpriced. Otherwise, you'd be indexing, right? So I’m assuming you're not indexing. "And let me actually explicitly state what I think the market has wrong." And now, that allows you to actually get a better look back because you can now compare your expectations to how the world actually unfolded.
Dan Ferris: The impression I get from this process... it's so rational – it's so reasonable and rational. And I hate to say this, Annie, but I think that's the reason why more people don't do this. Because people are so emotional about these things. And one of the things that doesn't surprise me here is that because of that, I think... it looks to me like every chapter of the book has a checklist. And we know checklists are great tools for that exact problem that I just named, right? It's, "You want to get your emotions out of the game as much as possible and make the best possible decision. So you go through a checklist to make yourself think." But I'm sorry. I cut you off. You were going to – you were going to respond to my...
Annie Duke: Yeah. No. I was kind of cutting you off, to be fair. So I'm going to... I wanted to let you finish. So yes. That's exactly right. I mean, you know, I think that part of the problem is that, in a lot of ways, I think this feels like work, right? But the thing that I really try to point out in the book is that in order to make a good quality decision, this work should be happening anyway.
And the reason is that when we think about cognitive bias and we know that that has a very bad effect on our decisions – and that's separate and apart from noise, which is what Kahneman and... he's got a book coming out with Cass Sunstein and Olivier Sibony in the spring, which is about a totally different problem besides cognitive bias... which is just that – catch me at a different time, and I'll make a different judgment, given the exact same information, right? And that obviously is a problem as well. So we've got these kinds of dual issues which are cognitive bias and noise.
And a lot of what's happening with that – the reason why your mind is kind of such a good playground for that stuff – is that we allow these decisions to remain pretty implicit. So the thing that I said about – if you're investing and you're not indexing, you – when you invest, what you're saying is, "I know something that the market doesn’t know." Obviously by definition, right? Otherwise, I would just put my money in Vanguard, in a Vanguard fund. Like, why would I do anything else? But if I believe that I know something different than what the market knows, I should have to make that explicit.
Because the problem is that we can fool ourselves into believing a whole bunch of different things. We can fool ourselves into believing we're smarter than the market or that we know something different. We can be very overconfident in our intuition and our decisions in that way. So when we actually follow this type of process... the first thing that's really important is that by actually being really explicit about, "What is the knowledge that I have that's going into this decision? What are the things that I could find out that would improve my judgement? What are the things I'm actually judging?" Right?
Like, I have to make explicit what it is that I think I know that's different than the market, right? Which means that I have to be explicit about, "What are the things that are important to this investment?" Whether it's from thinking about, for example, investing in a company as a venture capitalist or something – you know, I need to know like, "What's the market size? Is it crowded?" You know, those kinds of things... that I actually have to think about those things in a very explicit way because those are the things that I’m judging.
So just going through that process of, "Exactly what am I judging? And let me not leave this as an implicit process but let me actually explicitly state what these things are," is going to improve your judgment at the time that you make the decision in a way that's going to help crowd out a little bit of that bias and noise. Not all of it but a little. But the other thing that that does for you is that in doing so, you now have a record of what your expectations of the world are so that as you start to see what those outcomes are and you don't have a really big data set, you can look back and you can say, "Did the world unfold in a way that I did not expect?"
And then, you can look at those things that were kind of surprises in there, and that's where the learning is going to sit. Because then you can start asking about, "That was a surprise. Why was I surprised by that? Was this just something that was a tail event? Was it something that I didn't foresee but I could've foreseen? Is it something that I just couldn’t have foreseen at all, and there's a reason why I didn't take it into account and I got surprised by it?" I'm sure a lot of people feel that way about the pandemic, right? And this works, by the way – let me just say – on the good side or the bad side. If I invested in Zoom 18 months ago and I'm walking around today saying that I'm the most brilliant investor ever, it's helpful for me to go back and see what I wrote down about why I was investing. Because I'm guessing pandemic and everybody has to work from home wasn't on my list.
Dan Ferris: Right. [Laughs] Very true.
Annie Duke: Right? So this helps us on both sides, right? It stops you from that weird thing of like taking too much credit for the good stuff – which we do sometimes, which I’m sure there's a lot of people who own a bunch of Zoom, right? They're all going around... and I have no doubt that there was going to be a shift to a little bit more working from home but certainly not in the way that we saw. And it also helps you understand when things go against you and when you realize the downside outcomes as well in terms of understanding like, "Did I forget this? Was it in my set? Was it wrapped into my decision-making process or not?"
Dan Ferris: Right. So you work with corporations and with individuals, I assume, on decision education. And I'm wondering what your experience is in particular... When you ask people to assess a decision, do you look at their assessment about what they could've known and couldn't have known, and do you ever look at them and say, "Could you really have known that? Did you really know that?"
Because certainly as a guy who publishes financial research and does podcasts and stuff, I know there's a lot of folks who – they want you to have certainty about the future. And there's an underlying belief, then, that there's some way to know the future. Which, of course, there is not. The future unfolds as it unfolds, and nobody knows the future. I'm just wondering. What's your experience with people? Do you have to get them to sort of go at this two or three times to really be honest about what they knew and what they didn't know?
Annie Duke: The short answer is yes. And I just want to say that none of this stuff is malicious. I mean, it is sometimes. But, like, very rarely are we talking about something that's malicious. The person that people are mostly trying to fool is themselves.
Dan Ferris: Yeah.
Annie Duke: Because we all want to feel like, "Oh, I knew that at the time." You know? So the answer is yes. I mean, it really helps to have somebody who's facilitating these types of conversations so that they can say exactly as you just put it, "Did you really know that beforehand? Like, let's actually go through it." And it's one of the reasons why I say you should actually be doing this work as you're making the decision. Because then it's written down. And you can't go back and say, "I knew it at the time," because I can look.
And I can say, "Well, OK. But then how come this is what I see you wrote down? Because you kept it a big secret if you knew it at the time." But the thing that I will say is that just going through the process of trying to reconstruct what your knowledge was at the time is going to be helpful. Are you going to be perfect at it in retrospect? Absolutely not. Is there going to be stuff that you found out afterwards that is going to creep into your memory of what you knew before? Sure. But there's going to be less of it. And less of it is better. And again. Just to really beat a dead horse here, do it beforehand. You know?
And I think that's a lot of what ends up happening. When I'm working with individuals, they're – a lot of times, the reason they're bringing me in is because something is going wrong. I mean, people generally don't call me when things are going right. Although, it'd be good if they did because I could help them to make sure that things stayed right, right? So it's generally that I get the call when maybe things aren't going so well, and they have specific issues that they're trying to address. And as we're trying to go through this so that I can understand the past decision-making and what went into it and what the processes were, you know, a lot of it is me saying, "OK. Can I see the records?"
Right? Like, "Show me the data. Show me the input." You know? And they realize at that point what a kind of disaster it is that they don't have that to produce... that as you're trying to figure out, "What are the things that we did wrong or right in the past," that if you have no records, if you haven't been recording your decisions in any kind of way, it just – it becomes a tough problem to try to figure this stuff out. So I think it's through that discovery process that they start to realize like, "We need to build more structure into this." Because to your point exactly, it is hard to do it in retrospect.
Dan Ferris: You know why I love this book and I love what you're telling us? Because it is actually – it's really the scientific method in action. It's a process of... to a certain extent. To a certain extent, there's a process of conjecture and refutation. "Here's what I know. Here's what I don't know. This is my decision. Let's see how it turns out." And then, reality will deliver an outcome, and you will have to assess that. And it may refute your decision-making process. It may not. And I just think that's such a good thing. Period. I just think that is such a wonderful thing.
Annie Duke: Oh. Yeah. I appreciate that. And I think that, to that point – you know, when I think about, "Why aren't people being a little bit more careful in the way that they're thinking about their decisions," I think to your point it's like... what you just described, I think that everybody would say, "Oh, that would be a good thing." Right? Like, "I don't think anybody would refute what you just said... that that would be a good thing for their decision-making."
But it's that last little bit that you talked about that I think is at heart. Like, "So then, the world's going to tell you, 'Hey. Did you kind of have the prediction right or wrong?' And then, you should take your learning from that." And I think that that's where the hang-up in the process is. Because while everybody understands that in the long run that would be really good for their decisions... like, we can think about it, you know, with a simple example, right? If I believe that the sun revolves around the Earth, it's quite good for me to discover that that's not true – particularly if I’m trying to build a rocket somewhere, you know, or something like that. Like, there's reasons that – there are types of decisions where it'd be very helpful... I mean, obviously for somebody who – for a lot of people, it matters little whether that's true or not.
But you could think about that for things in your life, right? If I believe – if I have a particular way that I model data and it turns out that I could be modeling it slightly different and capture 5% more of outcomes than I actually am, obviously I'd be way better off if I knew that. So everybody endorses that. They say, "Yeah. OK. My beliefs are only going to be as good as – my decisions, rather, are only going to be as good as the beliefs that inform those decisions. And so, yes. If I had something wrong, if one of my beliefs wasn't accurate, I would like to know that."
And the method, you know... the way that you just described it, obviously this is going to reveal to you those things that you believe that need some sort of adjustment or correction... not necessarily a complete reversal but, like, I could clean it up around the edges a lot of the time. The problem for us is that while long-run we understand that that's good, short-run it feels like crap. Because short run, it feels like I was wrong. And some of the reason why we sort of misremember the past and we start to say, "No. I knew that" – and this kind of stuff is to avoid that short-run kind of pain, right, of feeling like I was wrong.
And the problem is that that choice – which is not purposeful, I don't think. But that tradeoff that we're making to sort of protect our ego and avoid that failure being around in the short-run is so bad for us long-run. It would be like human beings who never adjusted and said, "No. The Earth revolves around the sun, and it's OK that we were wrong for a while because now all of our decisions are going to be better in the future because Copernicus came along and fixed that for us." Right? But in our own personal journeys, we're hanging onto Ptolemy all the time. And we want to be able to let go of that.
Dan Ferris: Yeah. It's very difficult, isn't it, to make that kind of a change. I don't know. As I look through even – just look through the table of contents, and I talked about all of your checklists that you have. It looks like you have one at the end of every chapter.
Annie Duke: I have a checklist at the end of every chapter. But I also have a wrap-up. So the wrap-up is just basically like taking, "Here are the really important concepts from the chapter," just to give people like a nice reminder of that.
Dan Ferris: Sure. Yeah. Great learning principle, right? You wrap it up. As you described the process of looking backward and saying, "Well, you should've been doing these things from the beginning," it makes me feel like it's really rare for someone to want to invest a lot more time. Because that's what – there's an investment of time here that folks aren't making if they're not doing this type of work before they make a decision. And I think it feels like... well, most people, they make their decisions quickly and with less forethought than they want to, and maybe they don't turn out as well. But they still resist this idea of investing time in something that may not have this immediate payoff. It's insidious. It's an insidious fact of human nature.
Annie Duke: Yeah. I think there's a few things. I think one is, you know, this is what I... because sometimes I get that and like, "Well, this takes up more time." And I say, "Yeah. But I assume that you would like this thing, you know – you'd like to have a higher probability of the things that you invest your money in working out. So you might be giving up a little time, but you're improving your expected value, which if you're an investor is the most important thing."
So we want to understand that we'd actually like to – that this kind of... the first piece, right, is that front-end investment in time is going to – it's going to have immediate payoffs, number one, in the quality of the judgment that you're making at the time. And then, it's going to have benefits later in terms of how quickly you can learn... all of which is going to realize into higher expected value, which is what everybody is trying to look for – number one. Number two is that, at least in terms of when you get into like an investment committee, doing this kind of work in advance will actually save time in the meeting. It makes those conversations much more efficient because it allows you to hone in on what really matters.
So you do get a time saving. I talk about that in chapter nine in the book. So you actually do pick up some extra – you pick up some time elsewhere by putting your time into these kinds of – this kind of judgment process. That's kind of number two. And then, number three is that overall, actually, this process helps to speed up decisions. Because it gets you to think about the difference – it gets you to think about a few things. One is the difference between sorting versus picking.
And we tend to really spend a lot of time on picking between options that have already reached a threshold as opposed to just trying to figure out whether the option passes whatever our threshold is... which particularly for investors – for anybody who's doing anything where you can construct a portfolio so you can do investments in parallel – you really want to be thinking about it as a thresholding problem. So what happens is that we spend a lot of time when we have two things that have satisfied some sort of threshold... trying to pick between the two of them when actually that should be – there should be very little time spent on that part of the process anyway.
So you pick up some time in terms of thinking about that construct. Then the other thing is that there's just a lot of decisions that we go pretty slow on that aren't worth it because the downside to losing a little bit of accuracy isn't going to have a really big effect. And also, if it's reversible – if you can get off the position – you should care less about accuracy in the first place. So these things are just like – these are just generally, like, thinking about the type of decision that you're facing. So when you follow this process, it's true that there's some times when you're going to take more time with a decision. But interestingly enough, you end up picking that time back up by saving time on decisions that don't really deserve that much of your attention.
Dan Ferris: One of the things I definitely wanted to talk with you about is, you have a chapter here that gets into relationships. And you talk about marriage, which certainly is a fairly big decision. And this is the same chapter where you talk about the inside and outside view. And you started off talking about relationship Chernobyl, which is an interesting thought. And I have to say. If there ever is a situation in which the inside view is everything and the outside view just seems ridiculous to most people intuitively, it must be marriage, right?
Annie Duke: Yeah. So, I mean, it's certainly like the decision to get married for sure. So yeah. So just to be clear, the inside view, as I said, is like the world from inside your own perspective. Like the things that you know to be true of the world, the mental models that you have... Obviously this is where you're going to be making pretty – you know, this is where cognitive bias is going to live because if you think about something like confirmation bias for example, you're trying to confirm your own beliefs... not other people's. So that's obviously an inside view problem. And then, the outside view is what's true of the world in general.
So the idea is like, "It doesn’t matter if I personally think that the Earth is a trapezoid. It's not. It's still round regardless of what the things are that I believe." So just what's true of the world independent of my own perspective. And included in there would be things like base rates. And then also, we need to recognize that the outside view would also be somebody else's perspective on your situation. So two people could be looking at the exact same information, the exact same situation and they could come to very different conclusions about that information. And it's good to know that and to understand that other people are kind of modeling the world differently than you are in order to improve your own models of the world.
So I use relationships to sort of demonstrate inside-outside view. Because I think it's a place where it's just very intuitive that we can see that going on. So there's two examples that I give of it. The first is, you're listening to your friend talk about like... they've dated...the last 10 people they went on dates with were jerks. And they're going on and on about like, "I can't believe this. Like, people are such jerks," and complaining. And I don't think there's anyone who's ever listened to a conversation like that who doesn’t have the thought cross their mind that – something to the effect of, "Maybe you're picking jerks."
Now, I’m not saying you say this out loud. I’m saying you think this. "Maybe my friend is picking jerks. Or maybe they're picking perfectly nice people, and they're actually the jerk, and they make everybody be, you know, a jerk around them," and they're actually being unfair to those people. But while we all think that, we tend not to say those things out loud. You know, for a variety of reasons. Like, you know... whatever. But that's a good example of the inside and outside views. So your friend is thinking – it's just sort of thinking out like, "All these jerks sort of exist around me."
And that's their view of the world. And you from the outside are able to very clearly see that there are other explanations that would make – that would be reasonable in that situation. And that's just because you have a different perspective than they do. Now, in reality it would be helpful for them to know that you have a different perspective. But we tend to be really careful about expressing those things. The suggestion that I would make in that situation is to say to your friend... not arguing with them about the past but just to say, "OK. When you go out on your next date, how could you avoid picking another jerk?"
And generally, they'll then get to the outside view themselves. Because you're not attacking them and saying, "You're picking jerks," or, "You're a jerk yourself." You're allowing them to think about how they could run that algorithm, that picking algorithm, better in the future. And then, the other example I give – so that's sort of how you're colliding with other perspectives. "How do you see things differently when you're on the outside?" Because we've all been the friend with the 10 jerks in their life and not necessarily seeing it for ourselves.
And then, the other thing I talk about is like, if two people have just gotten married and you were to say to them... I don't recommend you actually do this in real life. This is a thought experiment. If you were to say to them, "What do you think the probability is that you're going to get divorced?" Like, they're literally fresh off the altar. And we know what they would say. "You know, our love is special. We're never going to get divorced. That's why we got married – because this is...you know, we were meant to be. The universe brought us together." You know, all these things.
But it takes – it's super simple to go look at a base rate and say, "Well, no. The base rate in America for couples getting, you know, divorced is somewhere around 50-ish percent." So if you were to – if you were to assess that, separately you'd say, "I think there is a 50% chance that they'll get divorced within," whatever time period it is that that base rate applies to. So we can all see that again really clearly when we're looking at another couple. But we know that we don't see that very clearly for ourselves because despite the fact that half of couples get divorced, only 5% of them have any kind of prenup.
Dan Ferris: Right. Well, we've actually been talking for a while. And it always goes by very quickly when we talk with you. So I only have one more question for you, Annie.
Annie Duke: OK.
Dan Ferris: Same question I ask all my guests. Same final question. If you could leave our listener with just one thought about decision-making and everything we've been talking about, what would that one thought be today?
Annie Duke: Yeah. So that one thought is this... that, as I said, your decisions are only ever going to be as good as your beliefs because your beliefs inform all the decisions that you make. And given that we have this inside/outside view problem which we just discussed, it's really good to get the perspectives of the other people. Just turns out that we're pretty bad at doing that. And it's because we do this one thing, which kind of ruins the whole game... which is, when we ask for somebody's opinion, we usually offer our opinion first.
So if I send you, like, a political article or something, I'll be like, "Hey. Will you look at this article? But they're really biased, and they're ignoring all of this other data. And I think that they're a, you know, political operative masquerading as" – whatever, I'll tell you everything that I think about the article. And then, I'll say to you, "Well, what do you think?" And that's true. Like, you can think about how hiring discussions go, right? Like, you interview somebody, and I go to ask your opinion. And I say, "Oh, yeah. I just interviewed, you know, Morgan. And I think, you know – obviously, their resume is amazing as like a data analyst.
But I’m concerned that they're a little bit abrasive, and they're not going to be particularly good in terms of offering helpful feedback with the team and," you know, blah-blah-blah... go through all my thoughts. And then I'll say, "Well, what did you think about them?" And that's just kind of the natural way we think. But once I've offered my opinion to you, I've kind of ruined the whole game because it's unlikely I’m going to actually find out what you really think. So here's what I want to leave people with. "I interviewed Morgan. I know you did as well. What did you think?" Literally period. Don't follow it with any other sentences. Just let the other person tell you. You'll be a lot better off.
Dan Ferris: Oh, that's exquisite advice. Thank you for that. That's great. I can't wait till your next book comes out so we can have you back again.
Annie Duke: Well, thank you. I'm actually in the middle of the proposal for it. So I already have it planned, so I am going to be writing another book, I guess.
Dan Ferris: All right. Well, as soon as it shows up from Amazon and I can read through it, I will – we will be giving you a call.
Annie Duke: OK. Awesome. Thank you so much. This was so fun.
Dan Ferris: Thanks again. Bye-bye for now, I guess.
Annie Duke: All right. Bye-bye for now.
Dan Ferris: Boy, that was great. Love Annie Duke. Love her books and her thoughts and the idea that we need to make better choices and that there is a way to do it and that it mitigates some of the, you know, cognitive biases and just, you know... noisy emotions that get in the way of these things. Especially as investors. There's actually – I have a book around here somewhere called The Investment Checklist or something like that. I don't even know if I've read it. There's too many books in here. There's like 1,000 books in here at this point. It's getting a bit much.
But checklists are a good thing. I know Mohnish Pabrai of Pabrai Funds is into checklists. And I think it's a good idea. A checklist is one of the ways that you know you actually have a strategy other than just like reacting to cocktail party chatter or news stories that you've read. If you have that checklist that you can go back to that grounds you in a workable, you know, strategy, you're way ahead of so many people. Great talk. Wow. OK. Let's check out the mailbag.
[Music plays and stops]
My colleague and friend, Dave Lashmet, is on fire right now. His average closed pick this year alone has returned 187% – almost triple your money.
Today, he's got a time-sensitive $13 stock pick that he believes is set to explode. This is an opportunity you don't want to miss. Listen to Dave's take, along with all of his evidence on the stock over at investorhourtech.com. Check it out.
[Music plays and stops]
In the mailbag each week, you and I have an honest conversation about investing or whatever is on your mind. Just send your questions, comments, and politely worded criticisms to [email protected] I read every word of every e-mail you send me, and I respond to as many as possible.
And really, it's like as many as I think I need to respond to, because sometimes there's a bunch of great ones, and sometimes it's like, "Eh." Had some pretty good ones this week. But the first two are from Gene L. and Michael S., and they're questioning my patriotism. Gene L. says, "When are you going to gain a bit of patriotism and stop touting Chinese stocks? Of course, blood money is just money until it isn't. -Gene L."
Michael S. had a longer e-mail here. He says, "Regarding readers' claims of election fraud, our entire system of jurisprudence in the U.S. is based on a person's being innocent until proven guilty beyond a reasonable doubt. This is accomplished through the rules of evidence, which can be quite stringent. In light of the fact that there's no admissible evidence of any fraud whatsoever – which is not even like true. But fine. No admissible evidence of any fraud whatsoever."
Oh, admissible. Maybe you're right about that, Michael. "And the whole accusation is based upon the claims of the biggest liar in history" – also impossible that that's true. "It is up to those who have a public platform to confront anyone who would convict based solely on an accusation. A real patriot stands up to those who would overthrow our Constitution, denying its wisdom. -Michael S." I couldn't disagree with you more, Michael, because I don't owe you diddly squat. I don't have any kind of obligation to confront anyone about any damn thing. This is a pervasive thing in our culture where you think because, you know, somebody's a professional athlete, professional entertainer, I've got a microphone in front of me, I do a podcast, I'm a news guy, I'm a famous anything, that I owe you diddly squat.
And I don't. You know what that's called? That's called think for yourself. Think for yourself. You say too many people have punted on this issue saying, "Well, I don't know about that," or some other mealy-mouthed response. I don't know about it, and neither do you probably. That's the thing that shows your integrity. Your integrity is based on saying, "I don't know," when you don’t know. Obviously, this burns me up. Talking about politics rots your brain, and I'm going to leave it at that. Oy. Next is from Al M. who is a very frequent correspondent. This week, like many weeks, he wrote us two e-mails.
And, sorry, Al. I mean, your e-mail's pretty long, so I can't do the whole thing. But I'll do as much as I can here. It says, "Dan. A little story. When I retired in 2008, I decided to study economics to understand what is going on with money. I had no left or right leaning – simply just didn't know why all this stuff was happening and what should be done for America. 'How does the system work?' was my objective – that is, understanding. So I probably read 100 books and worked 10 hours a day, seven days a week, trying to gain an understanding. Fortunately, I love this stuff. It all comes down to common sense."
Mind you, this is the guy who read 100 books, 10 hours a day. OK? He says, "It all comes down to common sense. You can't print an economy. The more you print, the less the money is worth. You can't spend to save an economy because it increases moral hazard. Pretty simple, and it's all backed up with simple observation of what has worked and what hasn't worked."
OK. Good point, Al. He continues. And this is a very nice compliment. He says, "My suspicion is, Porter senses what is coming, and that's why you are doing both the Investor Hour and the Digest. It will likely be what holds Stansberry together when the excrement the Fed has created hits the fan. I also do believe these Fed dudes, Bernanke, etc., are the worst possible criminals in America. I read Bernanke, Geithner, and Paulson's book, Firefighting. I'm convinced they had someone write it just to protect their asses from prosecution. Anyway... Love all your shows and effort. And please keep it up. It brings that reality that is so important to one's sanity. Al M."
First of all, thank you, Al, for suggesting that Dan Ferris is the glue that would hold Stansberry together if things got hairy. I don't think that's true, but the fact that you think it's true is quite a compliment because you're a very thoughtful guy. And I'll let the rest of your comments stand on their own. I agree with you. It is fairly simple over time. And like I said, I'm going to let that stand on its own. Very good. Thank you, Al.
OK. Next comes Tom S. He says, "Long-time listener. First time communicating. Alliance member. I just want to thank you for the podcast. I was initially disappointed when Porter Stansberry went off the podcast and you came on. Now I believe you do an equally great job, and for me even better as noted below. I especially appreciate your humility that is so sorely needed in your industry and in life. Part of the result is that you have taught me that no one hits the ball out of the park on every trade or investment and that even the so-called 'great investors' have big losers here and there. We amateurs tend to think you pros tend to make every trade a winner, and now I realize that no one does. So it makes it much easier to deal with stocks that are losers, get rid of them, move on, not blame myself for not predicting the future, let the winners run and not walk away believing I don't know what I am doing. And the paradox is that as a result, my outcomes are much better."
And I'll just say I think it really is a paradox because... in other words, he's selling lots of losing investments, and yet he's making more money. He continues, "Also, I appreciate the minimal use of hyperbole. Having used other financial newsletters over the years, I still get e-mails advertising the secrets to 10,000% gains, blah blah blah. Also, just thanks for all the education. Love your interviews and your rants. Thanks, Tom S."
Thank you, Tom S. And your point about, you know, every trade being a winner, not being a winner... basically what you're talking about is risk controls. We talk about that a lot on this program, and you seem to have discovered it. And the fact that, you know, you're saying that maybe we helped you with that is really great too. Great, great questions. Great e-mail. Thank you. Next comes Ron Z. And Ron says, "Hi, Dan. Love your weekly investor hours. A long-time friend from your days on the 12% Letter."
Boy, that was a long time ago. "So DRIPs." That is, dividend reinvestment programs. He says, "Given the current 0% trade commissions, do DRIPs still serve a purpose in your opinion? Obviously there's the full dividend reinvestment feature. But there's also the processing delays and the fluctuating purchase prices. What sayeth thou? Thank you, Ron Z." I sayeth unto you, Ron Z... you know, I tried to look up really quick, but I couldn't find the e-mail. I'm pretty sure that when I had my DRIP turned on – like, your broker will do this for you.
And I'm pretty sure when I had my DRIP turned on I wasn't paying commission on those other shares. I don't think I was. And I specifically asked them that question. I'm pretty sure they told me I wasn't paying commission. So I don't think that really plays into whether or not you do a DRIP. Do you want to reinvest your dividends or don't you? And the fluctuating prices shouldn't be an issue because they should be reinvesting pretty soon after you get the dividend. That's all I have to say about that.
Brendan K. says, "I have become increasingly convinced that allocating a small percentage of my net worth to cryptocurrency is a wise decision. My question is, how do I go about doing this? Any guidance would be much appreciated. Keep well, Brendan K." So, Brendan, you mentioned you're not interested – elsewhere in your e-mail, you said you're not interested in trading. You just want to hold it. And you found two of the most recommended sites were Coinbase and Robinhood. I use Coinbase. That's all I can tell you.
Like, I'm not going to make a recommendation. I'll just tell you what I do. And as for which cryptos... bitcoin and Ethereum are the only two I own. And I buy them to hold them. I don't trade them. And I'm going to put them in a wallet so that they're not hackable. They're not connected to the Internet. But actually, Coinbase has most of its crypto offline anyway. So hope that's useful to you. I use Coinbase. It works fine for me. Brad M. is the last one this week.
And he says, "I'm on the same page with you about the reversion to the mean regarding value stocks in terms of performing well over the coming years. I believe further collapse in confidence of government will push smart money out of government bonds into equities, and also tangible assets like gold and real estate." Going to stop you there, Brad. I don't know if a collapse in confidence of government necessarily would lead to more money going into equities, because I think that collapse in confidence comes as a result of efforts to stimulate the economy, resulting in lots of spending and lots of new debt and lots of new money being created.
And so, I would suspect that if people really lost confidence in government, they wouldn't be buying equities. He continues, "Bitcoin has a place as well, at least temporarily, until government decides on their own version" – going to stop you there again. The conversation is always framed as, "What's going to happen when the government makes bitcoin illegal?" You know, they're working on this FedCoin or FedCash or whatever the heck it's called. I don't even know. I don't even care. Look. That's not the perspective I have.
My perspective is, the Fed knows they have to get into that market and compete with the other 2,000 cryptocurrencies and especially bitcoin... you know, the big one. And it's too late. And bitcoin is outside the government. It's outside their system. They can outlaw it all they want to, and it's encrypted. We're still going to do it. I don't know. I feel like people don’t understand bitcoin if they ask these questions about the government and think the government's going to somehow make it go away.
You're also asking about value and blue chips and what funds to buy. I'm not going to answer that question because it gets too close to me giving you personal advice. Sorry, Brad. But thank you for those questions and comments about bitcoin and bonds and equities and all the rest. He says, "Thanks so much and keep up the informative podcast." Will do, man. Will do. That's another mailbag, and that's another episode of the Stansberry Investor Hour. Hope you enjoyed it as much as I did.
If you want to hear more from Stansberry Research in the realm of politics, check out americanconsequences.com/podcast. Do me a favor. Subscribe to our show in iTunes, Google Play or wherever you listen to podcasts. And while you're there, help us grow with a rate and a review. You can also follow us on Facebook and Instagram. Our handle is @InvestorHour. Our handle on Twitter is @Investor_Hour. If you have a guest you want me to interview, drop me a note: [email protected] Till next week. I'm Dan Ferris. Thanks for listening. [Music plays]
Announcer: Thank you for listening to this episode of the Stansberry Investor Hour. To access today's notes and receive notice of upcoming episodes, go to investorhour.com and enter your e-mail. Have a question for Dan? Send him an e-mail. [email protected] This broadcast is for entertainment purposes only and should not be considered personalized investment advice. Trading stocks and all other financial instruments involves risk. You should not make any investment decision based solely on what you hear. Stansberry Investor Hour is produced by Stansberry Research and is copyrighted by the Stansberry Radio Network.
[End of Audio]