Facts or Fear: The Case for Facts
Sue Desmond-Hellmann
2017 Rede Lecture, Cambridge University
June 12, 2017
AS PREPARED
Thank you for that lovely introduction on my first visit to the University of Cambridge. It's an honor to be here.
I brought something with me to share with you today. You've all seen speakers with props, but don't worry, it's not a widget, and I won't be doing a demo. I just want to share with you a treasured object that I've owned for 25 years. It's a handmade cherry wood bowl.
Like natural cherry wood, it gets darker and more beautiful every year. And the one distinction of this, well, it's never had a lettuce leaf in it, even though it's supposed to be a salad bowl.
I'm not one to carry things around when I move from place to place, but I've always had this cherry wood bowl. It's a gift I was given by a widower - the husband of one of my patients. I'll call her Erica. Her husband gave me this bowl, which he carved himself, after I failed to save his wife from breast cancer - a loss that left him and a one-year-old baby behind.
When I first received this bowl - and still, to this day - I was almost embarrassed to have such a beautiful thing, knowing that I didn't deserve such a gift, having struggled and failed to come up with a remedy, something to slow or stop the breast cancer that took Erica's life.
I knew then, 25 years ago, that there was a scientific driver called HER2 or ERBB2. In fact, former Rede Lecturer Harold Varmus - with Michael Bishop - had outlined oncogenes, these growth drivers, that made this cancer the most difficult kind to treat.
But 25 years ago, all I could do was get a research-grade diagnostic test, and know that that cancer was different. Worse. Scary.
The irony of that experience and my grief and sense of failure was that only five years later, I led the team at Genentech that developed Herceptin, a specific, science-based remedy for that very cancer.
And in the midst of my happiness and immense gratitude to Dr. Varmus, Dr. Bishop, and so many other scientists in the world who were behind that discovery…
…including your very own work in monoclonal antibodies, I might add…
…in celebrating that science, I only wished…
…I only wished that the science had been there in time for Erica.
I love science. When I was a kid I was more math-y than science-y. But my experience as a practicing cancer doctor and as a product developer means that I love science now more than I did when I was a kid.
And I love the scientific method. I love every bit of it. And the reason that I love science and the scientific method is that I've seen how much science matters for patients.
I've seen how much it changes lives. And I have a sense of haste, a sense of pace, a sense of urgency, because I've seen firsthand what happens when you don't have something and your patients are waiting.
And now at the helm of the Gates Foundation - as CEO - I love science even more. My colleagues and I, particularly in global health, are working with innovators across the world. And we are tapping into that same sense of urgency, that same sense of pace, to use innovation – science - to make the world safer and to advance equality.
And we believe - and I believe - that great science will markedly decrease pain and suffering and end up making the world a better place.
So I care deeply that we honor and value science. I think of science as something incredibly precious. And I want us to remain relentless, heads down, driving with a sense of urgency, to advance scientific knowledge in the quest to improve the human condition.
Now, you might ask yourself, “Why is she starting her Rede Lecture by telling us how much she loves science?” And my title today is Facts or Fear: The Case for Facts. So what am I so worried about, what's going on?
Here's the thing.
I have had a front row seat for the transformation of cancer therapy. It's incredible to have started my oncology fellowship - and gotten into the science and background of cancer - only in 1986.
But in my professional career, I’ve seen cancer therapy move from disfiguring surgery - radical mastectomies, the thought that all we needed to do was cut out more of the cancer…
…to powerful poisons - I was terrified the first time I gave a patient chemotherapy…
…and now to more precise approaches, targeting genes.
All thanks to science.
In fact, we've all had a front row seat for incredible advances in science and in using the scientific method to improve life. Indeed, many of those things Cambridge has been a part of. And life is better because of those improvements.
We can even measure how much decades of scientific improvement have directly contributed to improved quality of life through extended life expectancy.
It's remarkable, if you look around the world, how life expectancy has increased and improved. We've even eradicated one human disease: smallpox has been wiped off the face of the earth. And I can tell you we're right on the brink of more.
Just a few weeks ago, I saw Bill Gates and Jimmy Carter trash-talking each other about whether polio or guinea worm would be the next human disease to be eradicated from the face of the earth.
I didn't pick a winner; I hope they're both right.
The point of this is that modern technology, science, and innovation are better now than they've ever been. As great as these last few decades have been, as fantastic as the last century has been for science and innovation and improving life, I'm very confident that in the next several decades we have within our grasp bigger discoveries, more breakthroughs, better things.
But scientists these days are nervous. The scientific community, those of us who love the scientific method, feel like we're at risk. And the specific risk is that the scientific method will not be able, with pace, to improve the human condition in the ways that we all dream of and expect.
So what's behind my worry?
Just this past November, the Oxford English Dictionary named “post-truth” the 2016 word of the year. Post-truth. They defined post-truth, as “…relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief”.
“Objective facts less influential in shaping public opinion than appeals to emotion and personal belief.”
We appear to be living in a post-truth era. In 2016, Donald Trump was elected President of the United States, having challenged both climate science and vaccine safety. Your very own pro-Brexit MP, Michael Gove, asserted that "people in this country have had enough of experts." And on April 22nd, 2017, more than a million people around the world joined the March for Science to defend the role of science in policy and society.
Look, I'll grant you that this whole anti-expert thing, this whole anti-expert sentiment, it might be temporary. This might be a passing fad. And the scope of topics, while they're profoundly important - we're talking about things like climate change, GMOs, vaccines, these really matter - but frankly, given all of the science and innovation, all the things we all care about, it's still pretty narrow.
And most people still actually like scientists. We come in right behind the military when the Pew Research Center asks people who they trust.
So, we're okay for now. But if - as I would contend - it's a net benefit to society to value science and expertise, the scientific community needs to remain relevant. We need to argue against a post-fact era, a post-truth era. And we must participate effectively in the public dialogue that's going on around facts and truth.
That's why I picked this topic to talk about today.
Like all of you, I'm ambitious about what science needs to do and the problems we need to solve. And all our science, all of our work, the innovation we all want to drive, won't matter if politics, policy, and society aren't directly influenced by what we consider truth and facts.
Today, I want to talk about three parts of this debate that I think are most crucial for us as an academic and scientific community to engage in, to make sure we're relevant in those discussions about truth and facts, given the times we're in.
The three things are consequences, confidence, and credibility. Consequences, confidence, and credibility. These three aspects of the debate help frame how I think we should show up to be most effective.
Let me start with consequences. Consequences sort of sounds like I'm going to scold you or there's a punishment or something involved. But don't worry, there's no punishment.
I want to go back to the march and some of the discussion that I'm guessing happens in the dinner halls around here and in the hallway conversations. It's easy for critics to accuse scientists of overreacting. It's easy to say, “Oh, those scientists, they overreact to any pushback of their authority,” to categorize their concerns as arrogance or entitlement. “There they go again. Scientists, you know, they're just so delicate.”
In fact, in a poll about the March for Science, 44 percent of Americans said the protest would make no difference, and seven percent said the protest would actually hurt the cause of the scientists. That's 44 percent saying no difference, and seven percent saying you're working against yourself.
But here's the thing: Scientists are actually used to skepticism. This is normal for us. In fact, we use the term “healthy skepticism”. Healthy skepticism, it's part of the way we operate. We talk about healthy skepticism to assert that nothing should be accepted or rejected without considerable evidence.
But healthy skepticism is fundamentally different from denialism. And that's what we're facing today. Denialism is the abject refusal to accept established facts. Abject refusal to accept established facts - denialism. And denialism has real consequences. That's why we have to speak up. That's why it matters.
Denialism of climate change threatens humanity: with life-threatening weather extremes, with losses in agricultural productivity that can lead to hunger, with thirst from a lack of clean water - or any water at all - and with the health impacts of poor air quality.
Between 2000 and 2005, AIDS denialism in South Africa led to an estimated 330,000 deaths.
And just recently, in the United States state of Minnesota, there was a measles outbreak. This is vaccine-preventable measles. And this outbreak just tells you: measles is more contagious than virtually any other disease. You can leave a room while infectious with measles, and someone who walks into that room afterward can catch it. This is a super-infectious disease - but vaccine-preventable.
Now, in Minnesota alone in the eight weeks ending June 2nd of this year, there were more measles cases than had occurred in the entire United States in 2016. In eight weeks. In one state.
Getting scientific facts right matters. There are consequences to human health and the human condition of denialism, of denying what we know is true.
And the best scientists and the best innovators are fully aware of consequences. We actually do something I would call self-editing. And I want to make sure I'm really clear about consequences: consequences can be both positive and negative - consequences for how we think about policy and society with reference to innovation.
For instance, I'm on Facebook's board of directors. Mark Zuckerberg famously launched Facebook when he was still a college student, and when he did, he had a motto that's pretty widely known, and it's hanging on posters on the wall at Facebook. The motto is “Move fast and break things”. Move fast and break things. A vigorous dynamism reflecting a founder and entrepreneur who created social media: run into an obstacle, change it, fix it. That’s move fast and break things.
But here's what Mark Zuckerberg announced at the Facebook Developers' Conference in 2014, a decade after he had founded Facebook. He changed the motto to “Move Fast with Stable Infrastructure”. Not quite as sexy, is it, not as catchy. And I don't think it was just because he hit 30 that year.
Mark Zuckerberg talked about move fast with stable infrastructure because by then Facebook had more than a billion users. And efficiency and reliability became more important to developers than speed. Breaking things suddenly had a consequence.
Similarly, one of the first scientists in the world to describe the gene-editing technology CRISPR is a colleague of mine from the University of California, Jennifer Doudna. She and her colleagues first started talking about CRISPR in 2012.
I'm pretty sure you all know about CRISPR, but for those of you who don't, it's a remarkable new, precise gene-editing technology. There's a debate in the press right now about just how precise, which is really fun if you love science. But inarguably, CRISPR shows immense promise.
In 2012, Jennifer Doudna started talking about curing genetic diseases with CRISPR technology. Curing genetic diseases. But Jennifer Doudna was also part of a David Baltimore-led scientific coalition that in 2015 called for a worldwide moratorium on gene edits that could be passed along to subsequent generations of humans. So, she and her colleagues in 2015 self-edited by saying: we will not use CRISPR on the heritable human genome.
They were worried about consequences.
Now, as you heard, I went to the biotechnology company Genentech in 1995. 1995 was also the 20th anniversary of a prior self-edit. In 1975, Nobel laureate Paul Berg led a conference at Asilomar, in California. In 1975, recombinant DNA technology had been described only about a year earlier, and the world was nervous about it. So, a group of scientists met at Asilomar and made a pact that they would not utilize recombinant DNA technology until there was more proof of its safety and efficacy.
Biotechnology companies started - the first one in 1976 - knowing about that premise of a slow and steady rollout of all the promise and all the power of recombinant DNA technology, which we all use every day today. Self-editing.
Now, I like these examples because what both Mark Zuckerberg and Jennifer Doudna demonstrated - and was demonstrated much earlier by scientists in '75 and '76 - is their own sense of personal modesty and responsibility.
The scientific community is right to speak up when the world rejects its consensus views. But speaking up will be enhanced by scientists' ability to put their own decision-making - their own sense of prioritizing consequences - in context, given the societal consequences of adopting new innovations too rapidly, or not rapidly enough. Both are consequential.
My own knowledge of consequences comes from being a product developer.
I've told you the very positive story of being a part of Herceptin's product development. But I was also a part of another product development where I got an even earlier look at potential consequences. In 1971, Judah Folkman published a paper in the New England Journal of Medicine. It was a paper that was quoted every year when I went to the cancer meetings. It was a paper that posited that cancer could grow no larger than a ball bearing - two millimeters - and it couldn't spread unless it made new blood vessels.
That 1971 paper was Judah Folkman's first description of the dream that anti-angiogenesis would allow cancer doctors to cut off the blood supply and starve tumors. It was pretty cool, and we all liked the hypothesis, but by the time the mid-'90s came around, poor Dr. Folkman had been to many cancer meetings and literally was greeted with a kind of, “Oh, there he goes again.”
So I was delighted when in the late '90s at Genentech we had identified VEGF, vascular endothelial growth factor, and using monoclonal antibody technology, we made what's now known as Avastin, anti-VEGF or Bevacizumab.
I remember like it was yesterday the terror in my mind and the sleepless night I had when the first patient on earth was treated with anti-VEGF. Imagine for yourself: how would you like to shut off the single most important protein for the growth and stabilization of your blood vessels? I had this nightmare that all blood vessels would collapse and horrible things would ensue, despite the animal studies.
This was the first human on the face of the earth treated with an anti-angiogenic. Consequences.
Several years later, we did one of the most important studies with Bevacizumab, anti-VEGF, in patients with lung cancer. And we saw this incredibly powerful response in some patients - so much so that a few patients died while coughing up blood. The necrosis, the death of their cancer, was, as Judah Folkman had foretold, so powerful that there was bleeding - life-threatening bleeding, because it was lung cancer.
We had a discussion with the FDA about moving from that limited clinical trial setting in lung cancer to actually selling Avastin all over the world, starting in the United States and then getting it approved in Europe and beyond. I can tell you what was on my mind, just as when the first patient on earth was treated with anti-VEGF: we had data from several hundred people, but not a lot of information on long-term safety. But patients were waiting. Patients with lung cancer, colon cancer, and other cancers needed a new therapy.
That sense of consequences, that sense of responsibility, is for me part of being an effective scientist. Slowing down the launch of a new product means patients go without a potentially helpful new therapy. Speeding it up means that if the net clinical benefit is positive, that's okay - but if there's a rare safety issue, you find out late, and thousands, tens of thousands, hundreds of thousands, or in some conditions millions of people could be at risk.
Confidence is the second thing I want to talk about.
You can’t talk about consequences without talking about confidence, because the two are so essential if we want to be spokespeople for the truth.
Confidence is a critical part of being a scientist. And here's what makes confidence an even more important part of being a scientist than when I trained: the global nature of business and the rapid spread of information through the internet mean that decision-makers and policy-makers often face significant consequences when they rely on scientific opinion or consensus.
Now, you all know that social media takes that rapid dissemination and sets it on fire. It turns out that social media allows for the rapid spread not just of facts and truths; it also seems a particularly good way to rapidly spread myths, misinformation, and misunderstandings.
Given that, for innovators - no matter what we do, if we're doing new things, as we do in science - the ability to appropriately assess confidence is an essential part of how we show up.
Now, I hope most of you in this audience know about the scientific method. So forgive me if I'm telling you something that's obvious. But one of the essential elements of the scientific method is that every study is just our attempt to simulate what the real world would do and replicate things as they are.
We're predicting what would happen if we could do an experiment repeatedly. It's normal that we say this needs further study. Scientists' favorite thing to say is, “It should be reproduced, it needs more study.” It's normal that we would do that and that we would try and have further inquiry to sharpen our expert views of what's really happening.
But we don't want to freeze policy-makers. We want policy-makers to be able to understand and to value what we say and what we put forth. We want leaders to encourage and favor research findings that are supported by multiple lines of evidence - consensus, where more people agree with the evidence and there's more scrutiny. We want policy-makers to favor that kind of evidence - where we use words like "multiple people" and "repeated experiments" - over even exciting findings that are supported by less information or less scrutiny.
Too often, investigators think of confidence intervals as merely a statistical question, or even worse, as a single P value. Now, I know that this was the home of Fisher, so I know I'm on hallowed ground speaking of P values at Cambridge. But here's the thing: even Fisher never meant for a single P value to tell you the truth.
There was a great article in Nature written by Regina Nuzzo. She wrote about the tyranny of P values, pointing out the limitations of a single answer. Really, all a single P value conveys is whether you have rejected the null hypothesis. That's all it says.
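To make that concrete, here is a minimal sketch - assuming Python with NumPy and SciPy, and purely for illustration, not an example from Nuzzo's article - of how two studies can both "reject the null" while telling very different stories, stories that only a confidence interval reveals:

```python
# A toy illustration: two samples that both reject the null hypothesis
# of zero mean, but with very different effect sizes - a difference
# visible in the confidence intervals, not in the bare P values.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

samples = {
    "small effect, n=400": rng.normal(loc=0.2, scale=1.0, size=400),
    "large effect, n=5": rng.normal(loc=2.0, scale=1.0, size=5),
}

for name, data in samples.items():
    # One-sample t-test against a true mean of zero.
    t_stat, p_value = stats.ttest_1samp(data, popmean=0.0)
    # 95% confidence interval for the true mean.
    low, high = stats.t.interval(
        0.95, df=len(data) - 1, loc=data.mean(), scale=stats.sem(data)
    )
    print(f"{name}: p = {p_value:.4f}, 95% CI = ({low:.2f}, {high:.2f})")
```

Both samples yield small P values; only the intervals show that one effect is tiny and the other is large. That context is what the next three questions are about.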
Now, if you're judging that your evidence is worth a second look, you should actually look to Johns Hopkins statistician Richard Royall, who recommended that scientists ask three questions on confidence: What is the evidence? What should I believe? And what should I do?
What is the evidence? What should I believe? And what should I do?
When we utilize vaccines like MMR, we have great confidence, because millions of children over decades have been protected with these products. That's different from novel vaccines, like the rotavirus vaccine launched this year in India to prevent diarrheal disease. Now, that vaccine has been prequalified by the WHO after large clinical trials, but it will be essential to have post-marketing safety monitoring of the rotavirus vaccine to build great confidence as more children are vaccinated.
Our confidence in all scientific findings, ultimately, should be judged on whether the findings make sense and what they mean in the context of the clinical environment.
That's consequences and confidence. And it brings me to what I would say is the most important thing in the current discourse in this “post-truth” era: Credibility.
If scientists want our messages to matter, perhaps no topic is more important than credibility. And you all know, credibility has two sides: one is trustworthiness, the other is expertise.
Scientific credibility comes from scientists adhering to the scientific method. From evidence of competency, like an advanced degree or certification. From the prestige of your training institution. And from success within the peer-reviewed scientific literature.
Now, I want to mention on the subject of credibility that there's been a lot of global debate over the importance of conflicts of interest, both financial and non-financial, and doubt over whether transparency is enough to overcome conflicts and decrease doubt.
My own experience with conflicts of interest might be instructive to you. As you just heard in that lovely introduction, I was a scientist and a physician long before I went to private industry. And I remember extremely well my defensiveness the first time someone accused me of being a drug-company lady.
I spoke up and said, “No, you have got me wrong. I didn't take off that white hat as a doctor and a scientist and put on some black hat. I'm a doctor and a scientist. I care deeply about the patients I'm trying to serve, you're wrong about me.”
But, actually I evolved as I spent more time in industry. And by the end of my tenure at Genentech, I would introduce myself to people at a medical meeting and I'd say, “I want to debate you. I want to debate you on this scientific topic. But before we start the debate, let me tell you I work for Genentech, so you should put an asterisk on what I say. You should know that I benefit financially, professionally, in every way, if the things my company is doing succeed.”
I believe that all of us need to take conflicts of interest extremely seriously and admit them up front. It's not enough, but it's important. We can't solve for the biases that inevitably come from our quest for fame, glory, money, intellectual property, power. We all have conflicts. But conflicts of interest - and publicity about retractions from the scientific literature and about overt fraud - have eroded the esteem the public has for scientists. It's actually kind of sad.
But lack of esteem for experts and elites is not limited to scientists. There's actually an emerging literature, particularly over the past five to ten years, showing that traditional sources of authority in the world - clergy members, heads of state, local business owners, CEOs, physicians - all traditional authority figures have become diminished in our world.
Instead, the public often relies much more on something that I would call “identity” rather than following the lead of their head of state, their priest, or anybody who would be a traditional source of knowledge.
What do I mean by “identity”? Identity is something that might have been called tribalism in certain cultures. Identity might be traditionally associated with a political party or a religion. But identity, in a time of social media, has become much more about taking sides. About who I am, about how I see myself, about my opinion on things, including scientific things. Opinions driven much more by my identity rather than what anyone in this room tells me about the science.
Now, there's a term for this: "echo chamber". Identity magnifies the traditional way we think about echo chambers. Echo chambers, like a lot of other things in our current world, are highly prone to the network effect. So, when a like-minded group of people are sharing, typically using social media, they start to share similar views and develop tunnel vision.
And this phenomenon is powerfully amplified by the kind of algorithms that Google, Facebook, and Twitter use, which drive specific information to an individual's online feeds. So identity becomes much more significant and powerful than whether some expert or some elite told me the facts.
As such, I would submit that expertise, specifically expertise in communication, is one of today's important credibility challenges for the scientific community.
And here's what's ironic. Think about this. Who's good at convincing people and influencing people?
Business? Business is great at influencing people. If you use Amazon, which I do, they know what kind of sneakers I like. Amazon is really good at knowing what I'm going to buy. Businesses of all sorts have gotten extremely good at influencing. Politicians? Politicians - especially some politicians recently in the United States, but this is a global phenomenon - can influence how you think. And the media? The media themselves, particularly social media, are much more adroit at influence than scientists are.
It turns out that scientists are not that great at utilizing tactics to maximize their influence. And this is daunting for us. First of all, as I just told you, nobody considers the fact that we're experts, or credentialed, or went to a great school, to matter much anymore.
More than that, social science research shows that simply offering more facts - saying it over and over, or maybe even louder, or saying, "Do you know who I am?" - doesn't work. In fact, it can backfire, further entrenching that identity-based system of beliefs that tells me how I fit in.
So, if we're at risk of people doubling down on their beliefs, what will we do?
Well, there's actually good news on the credibility front and on the expertise and the communication front. Scholarship around scientific communication, scholarship around decision science, social psychology, behavioral science is increasing. And it's beginning to yield insights into what works.
One of the most promising examples of these concepts comes from right here, from your colleague, Sander van der Linden and others with research that I really like.
Now, I'm not a social scientist, I must confess. I'm out of my league. But I'm going to get good at this. Because if I'm going to have an influence, I need to learn about this area of science.
What Dr. van der Linden and his colleagues have done is looked at the public's beliefs on an area very prone to denialism: human-caused climate change. And human-caused climate change is a topic that matters a lot, has consequences. They've looked at how to increase public perception of the expert consensus around climate science.
What they found is that misinformation neutralizes the positive effect of scientific consensus - it decreases your willingness to listen. Misinformation is powerful. So what he and his colleagues have done is actually inoculate people - kind of like a vaccine - by warning them about the tactic in advance.
Here's one way people spread misinformation: they'll put a petition in front of you, signed by a collection of people they claim are actually scientists, saying that human-made climate change is wrong - that climate science is wrong.
It turns out that inoculation works. You can protect public attitudes about scientific consensus by sharing with people the way that misinformation is spread. And this inoculation theory of psychology, which dates back to the 1960s, has been championed by a number of Australian scientists, including John Cook, as a key tactic when a scientist confronts a conflict between science and myth.
This has now become known as the cognitive psychology of debunking. And it's really important. It really matters.
It gives me hope that some of our colleagues are really good at understanding how people make decisions. That they're delving deeply into the psychology of identity, and helping us understand what's behind it and how - with honor and respect for people and their opinions - one can overcome myths and misperceptions.
So, colleagues, in February of 2017, there was an interview that appeared in The Atlantic by Jeffrey Goldberg, the editor, and two of my bosses, Warren Buffett and Bill Gates.
In this interview, Jeffrey Goldberg asked, “Look, you guys are such believers in the inexorability of progress. You guys are optimists like crazy. But, you know, it's easy to be an optimist when you're the richest guy in the world and the third-richest guy in the world. So, come on. Do you really believe all this? Look at the times we live in. Aren't you worried?”
When Jeffrey Goldberg put this question to Bill Gates, he said something that I'm going to quote because I think it was just super. Bill Gates said, “I predict a comeback for the truth.” I predict a comeback for the truth.
And he went on to say, “People want success, they want education that works, they want healthcare that works. And so to the degree that certain solutions are not created based on facts, I believe those won't be as successful as those based on facts.”
Not going out on a limb here: I agree with Bill Gates. I think he was so right in predicting a comeback for the truth.
And in the end, here's the thing we all know. We all know - and this was my favorite part of being a clinical drug developer - that in the scientific method we estimate what truth is. We're predicting what truth will be in the end. Predicting the future, estimating truth, is the essence of the scientific method. And it's reassuring to know - and to remind ourselves, maybe, in the times we live in - that the truth wins every time, and facts stand the test of time.
Some of what we're seeing, I think, is the consequence of thinking short term rather than long term. So, I also predict a comeback for the truth.
In the meantime, as truth is coming back and we deal with this post-truth era, how should we all comport ourselves? Having told you about consequences, confidence, and credibility, I have a few suggestions on each.
Starting with consequences.
Scientists need to embrace our own humanity and our own communities. I encourage all of us to know our neighbors. And here's the thing: you know that echo chamber? We've been a little guilty of that. Get out of your own network of friends; especially, seek out friends who aren't like you.
Travel and get out of your bubble. You will be better as a scientist at assessing problems, solving problems, and more importantly, understanding the consequences of potential solutions if you understand and respect others. That's the start of having the humility to think about consequences.
Now, humility is a virtue. It's seductive to exaggerate or boast when you're excited about your stuff, when you've got something novel or you've done something important. And, look, funding is tough. We're all in the game of attracting funding for our causes and what we do. When I was Chancellor of the University of California, San Francisco, my husband, who's also a physician and a scientist, accused me of becoming a "serial exaggerator". Terrible. And, actually, if you saw me in front of potential funders… "we have the best students, the best faculty, you have never seen anything like it". Maybe I was a bit of a serial exaggerator.
But here's the serious part: for all of us, diligence in putting our work into the right perspective, the right context - understanding how the work, if applied, will influence practice and policy - and that self-editing I spoke about, should all be a key part of how we introduce others to new things.
Confidence.
Every scientist must, must, must have training and practice in how to think about inference. And that training should not be limited to narrow statistical measures, but should include multiple, meaningful approaches to assessing confidence in study outcomes.
Say it in English. Figure out how you would talk to a non-scientist about the findings.
And here's how to drive more confidence: open sharing of data, teamwork, and more dialogue and visibility when a study is negative or not reproduced will all give consumers of science more confidence.
The Gates Foundation has an open access policy. We believe in teamwork. We believe in sharing data. And our open access policy for all our funded work is especially important because if we're going to get out into communities, and include those who can't pay for scientific publications, open access enables that rich dialogue that increases confidence.
Credibility.
Scientists who have friends who aren't like them, and who are engaged in community and family life as part of civil society, will immediately be more credible and trustworthy.
Formal training in simple, clear, non-jargon-y communication is an essential part of scientific training. And it's an essential part of continuing education. You don't learn everything you need to know about communication in school.
Scientists speaking plainly about their findings will diminish the likelihood of loose interpretations.
And, finally, imagine making a pitch to invest in and celebrate what some people call - very inappropriately, I think - "soft science". I don't think there's anything about brain science that's soft.
Decision science, social and behavioral psychology. Start reading about the data in those fields and support colleagues who are bringing novel tools and incredibly interesting new brain science to the way we think about how people make decisions.
Only by advancing decision science, and social and behavioral science, will we as scientists catch up to a world where people are more connected, and more linked to their own identity, than ever before.
Albert Einstein said, “Whoever is careless with the truth in small matters cannot be trusted with important matters.” Let's make a pact as a community, as a scientific community, that we will not be careless with the truth; so we can maximize our impact on the most important matters. In the end, someone like my patient, Erica, is counting on us to drive knowledge, to drive deep understanding, and in the end, to improve the human condition. Thank you for listening.