People are bad at estimating risks. We have many biases that make us underestimate real risks or overestimate false ones. At present I choose to focus on the former, although the latter is not without its own costs (mostly through avoided experiences). There are risks we are generally aware of but analyze with self-deception, so that we can ignore them and keep doing what we're doing; and there are risks we don't even consider as risks, either due to lack of information or due to self-deception.

First, lack of information. If we don't even know that something is risky, how can we use any method at all to estimate that risk? I propose that it is generally helpful to examine every activity we do as a potential risk. However, since our time is limited, we need to prioritize, spending our analysis time on the activities that are actually somewhat likely to turn out risky. I propose a prior over activities that combines our initial uncertainty (our gut feeling: how uncertain am I that this is safe?) with prior knowledge from other people. This way, an activity that many other people do will be prioritized lower than an activity that only I do.

In a conversation, sugar consumption was raised as a possible counterexample. That objection is somewhat valid, but it is not a real counterexample: yes, under my heuristic, if there existed no prior knowledge about this at all, I would prioritize risk analysis for a reasonable amount of sugar consumption lower than risk analysis for dropping down to zero sugar consumption (which is the more unusual choice).
I think it's a bad example because we do have a lot of prior knowledge about the risks of sugar consumption. Either we consider the decision as it would be made today, in which case we would prioritize risk analysis for sugar (because although many people consume it, it is also well known that it could be bad for us); or we consider it in the context of many years ago, before the risks of sugar consumption became known. In that context, I argue, it does make sense to prioritize risk analysis for zero sugar consumption over risk analysis for consumption in quantities similar to most people's, and this decision only seems wrong when examined with the knowledge of today. It would not actually have been wrong in the past context.

Another approach I propose for identifying unknown risks is to examine our own lives and other people's lives, look for negative outcomes, and work backwards from there: why did it happen? What were the mistakes, if any? How can this be avoided in the future? It is important to generalize as much as possible, so that a single observation can identify a whole group of risks.

What about self-deception? We can use self-deception either to conclude that a risky activity is not as risky as it truly is, or to hide from ourselves the very possibility of it being risky at all. As a general statement about self-deception, I believe we can identify it by careful examination of our feelings: we know when we're not being honest with ourselves. Regarding risks specifically, the second case (holding an explicit belief that an activity is safe when it is in fact risky) is similar to the case of unknown risks above, so we can apply the same heuristics to identifying these risks; and once we examine them, exposing them as real risks should be low-cost, since the main barrier is ourselves rather than a real lack of information.
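The prioritization heuristic described above can be sketched in code. This is a toy formalization of my own: the equal weighting, the 0-to-1 scales, and the example numbers are all assumptions, not part of the original argument.

```python
def analysis_priority(gut_uncertainty: float, fraction_doing_it: float) -> float:
    """Rank an activity for risk analysis; higher means analyze it sooner.

    gut_uncertainty: 0.0 (feels clearly safe) to 1.0 (no idea whether it's safe).
    fraction_doing_it: rough share of other people who do the activity;
    widespread activities get lower priority, all else being equal.
    """
    unusualness = 1.0 - fraction_doing_it
    # Equal weighting is arbitrary; the point is only that both signals matter.
    return 0.5 * gut_uncertainty + 0.5 * unusualness


# With no prior knowledge about sugar, the unusual choice (zero consumption)
# gets analyzed first, matching the heuristic described above.
typical_sugar = analysis_priority(gut_uncertainty=0.4, fraction_doing_it=0.9)
zero_sugar = analysis_priority(gut_uncertainty=0.4, fraction_doing_it=0.05)
```

Actual evidence about an activity (such as today's knowledge about sugar) would of course override this default ranking, as the sugar discussion above notes.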
The first case (deceiving ourselves about the magnitude of a risk) is even easier, since these activities are already flagged as somewhat risky; we just need to be honest in analyzing the risk involved. This is not easy at all, of course, but it falls under rational risk analysis, which is a basic requirement for all of the above.

Rationally analyzing risk is hard. In most cases it is difficult to accurately estimate the risk and compare it to the benefits involved. Still, I claim that a simple (not decision-theoretic) analysis is much better than nothing: list the known risks and the known benefits, take the time to truly look into the available data (the more scientific, the better, though scientific information is often wrong too), and come up with suggestions for reducing the risk.

An example from a recent conversation is biking. Biking is easily flagged as risky because (1) it is generally considered risky even though many people do it (similar to sugar in that sense), and (2) when we bike with friends, we often hear their stories about past injuries and accidents, and we see evidence of accidents around us. We may still underestimate the risks due to self-deception: we only get to hear accident stories from people whose accidents were not severe enough to take them out for good, and of course we don't want to go looking for the really bad accidents, because we're perfectly happy thinking that biking is safe. If we do take the time to examine the risks of biking, we may find that they come from cars, from the roads themselves, and from bad equipment (or so I'm told). We can then conclude that if we only bike on good roads with no traffic sharing the road, use good equipment and keep it in good shape, and ride with partners, biking becomes safe. I'd like to do this kind of analysis on many more aspects of life.
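The simple listing analysis could be captured in a small data structure; here is a sketch using the biking example. The risks and mitigations come from the discussion above, while the benefit entries are hypothetical placeholders I added, since the original text does not enumerate them.

```python
from dataclasses import dataclass


@dataclass
class RiskAnalysis:
    """The simple, non-decision-theoretic analysis: just explicit lists."""
    activity: str
    risks: list
    benefits: list
    mitigations: list


biking = RiskAnalysis(
    activity="biking",
    # Risk sources identified in the discussion above.
    risks=["cars sharing the road", "the roads themselves", "bad equipment"],
    # Hypothetical placeholders; the original text does not list benefits.
    benefits=["exercise", "enjoyment"],
    # Each mitigation addresses the risk at the same position above.
    mitigations=[
        "only bike on roads with no car traffic",
        "choose good roads and ride with partners",
        "use good equipment and keep it in good shape",
    ],
)
```

Writing the lists out explicitly makes it obvious when a known risk has no corresponding mitigation, which is the whole value of the exercise.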
We don't have to give up activities: many activities can be tweaked to make them safer.