
Bias: a normal operating characteristic of the diagnosing brain

Pat Croskerry
From the journal Diagnosis

Abstract

People diagnose themselves or receive advice about their illnesses from a variety of sources, ranging from family and friends to alternative medicine and conventional medicine. In all cases, the diagnosing mechanism is the human brain, which normally operates under the influence of a variety of biases. Most, but not all, biases reside in intuitive decision making, and no individual or group is immune from them. Two biases in particular, bias blind spot and myside bias, have presented obstacles to accepting the impact of bias on medical decision making. Nevertheless, there is now a widespread appreciation of the important role of bias in the majority of medical disciplines. The dual process model of decision making now seems well accepted, although a polarization of opinions has arisen, with some arguing the merits of intuitive approaches over analytical ones and vice versa. We should instead accept that it is not one mode or the other that enables well-calibrated thinking but the discriminating use of both. A pivotal role for analytical thinking lies in its ability to allow decision makers the means to detach from the intuitive mode to mitigate bias; it is the gatekeeper for the final diagnostic decision. Exploring and cultivating such debiasing initiatives should be seen as the next major research area in clinical decision making. Awareness of bias and strategies for debiasing are important aspects of the critical thinker’s armamentarium. Promoting critical thinking in undergraduate, postgraduate and continuing medical education will lead to better calibrated diagnosticians.

Introduction

Diagnosing and treating illness has been going on for many thousands of years, probably well back into the Paleolithic period. One of the functions of shamans was to heal ailments by extracting the disease spirit from the body of the afflicted [1]. Allopathic (orthodox) medicine evolved from those early beginnings, but other groups have also evolved to expand the options of those seeking diagnostic services. Modern-day shamans continue to practice, along with faith healers, naturopaths, homeopaths, and many others. It is estimated that over 38% of Americans used some form of alternative medicine in 2007 [2]. Notwithstanding what is available through allopathic medicine, and what these alternative approaches offer, the majority of common illnesses and complaints are diagnosed and managed through self-diagnosis, often via the internet, or through what Freidson calls the “lay referral system” of family, friends and acquaintances [3]. Generally speaking, no group, society or culture suffers a shortage of diagnosticians; however, their main instrument of diagnosis, the brain, operates under an inherent restraining characteristic – bias.

At its heart, the diagnostic process involves an appreciation of cause and effect. When signs and symptoms are manifest, the diagnostician makes a connection between them using some frame of understanding that is distributed along a spectrum from evidence-based to faith-based “knowledge.” Understanding cause and effect is not as easy as it sounds. We are ever vulnerable to seeing correlation as causation, misinterpreting things simply because of temporal relationships, being suckered by logical fallacies, seeing meaningful patterns where none exist, and being completely fooled by illusions and magicians. These are failings in the way we process our perceptions, i.e., in the ways that our brains work when making decisions. The problem may lie in insufficient knowledge (a declarative shortcoming) or in the process of reasoning and deciding (a procedural shortcoming). Studies suggest that procedural shortcomings, the ways in which we think, are principal factors determining diagnostic calibration [4, 5]; these are generally referred to as cognitive factors. Although a distinction is sometimes made between cognitive and affective factors [6], affect is involved in nearly all of cognition, so cognitive factors generally include affective factors.

Dual process decision making

It is now widely accepted that there are two modes of decision making. The intuitive mode is fast, autonomous and frugal. Cognitive psychologists refer to it as System 1 or Type 1 processing, and it is where we make most of our decisions in the course of our daily affairs. Kahneman notes that many of our decisions are made through mindless serial association [7] – not “mindless” in a negative, thoughtless sense, but simply the way in which particular patterns elicit particular responses such that it is possible to get through much of the day simply by moving from one association to another. Performing a well-learned act such as driving a car is a good example. Very little active reasoning is involved in getting the car from A to B. However, if parallel parking is required at B, then reasoning and judgment are essential to maneuver the car successfully into the parking space. Much of serial association is guided by heuristics: maxims, abbreviated ways of thinking, rules of thumb, educated guesses or often just plain common sense, all of which reduce the cognitive effort in decision making. They are also effective in situations where information is incomplete or there is uncertainty. They have their origins in the concept of “bounded rationality” first described by Herbert Simon [8]. They are either learned or hard-wired and range from the very simple to the more complex, but all can be executed in a fairly mindless fashion, i.e., without deliberate thinking. All of us, in all walks of life, live by these heuristics, which mostly serve us well. The second mode of decision making involves analytical thinking, referred to as System 2 or Type 2 processing, which involves deliberate reasoning; it is generally slower and more resource intensive.

Given that we spend an estimated 95% of our time in System 1 [9], and given how often we are rewarded for the time spent there, it is not surprising that we have come to rely upon and trust our intuitions. On the face of it, Type 1 decision making is a very good deal – little cognitive effort is required and it is mostly effective. Simply having biases often removes interference and distractions and allows us to cut to the chase with a set response. However, the dark side of heuristics is that biased ways of looking at things will occasionally fail. It is the price we pay for low-resourced decision making. Failure can also occur in System 2, usually when analytical reasoning is poorly conducted: reasoning itself may be deficient, incorrect information may be used, or correct rules may be misapplied. There are occasions, too, when analytical reasoning may not be appropriate, e.g., in an emergency where a very fast response is required, or when too much time spent on the process leads to “analysis paralysis.”

Cognitive and affective bias

Lists of cognitive and affective biases are now abundant. Wikipedia currently lists 95 cognitive biases, as well as a variety of social biases and memory errors and biases [10]. Jenicek lists over 100 cognitive biases [11]. Dobelli reviews about 100 in his recent book [12], along with practical advice on how to deal with them in everyday life. Bias is so widespread that we need to consider it a normal operating characteristic of the brain.

Physicians may once have quietly nursed the hope that they have immunity from such defects in decision making. However, cognitive biases, reasoning failures, and logical fallacies are universal and predictable features of human brains in all cultures, and there is no reason to believe physicians as a group are much different from any other. Some will persist with the notion that they are not vulnerable to these phenomena, which itself constitutes the bias blind spot [13]. The difficulty some people have in recognizing their own biases is one of several obstacles to a more widespread awareness of the problem (Table 1). Nevertheless, the impact of cognitive bias on clinical decision making has now been recognized in most disciplines in medicine: Anesthesia [14], Dermatology [15], Emergency Medicine [16], Medicine [17, 18], Neurology [19], Obstetrics [20], Ophthalmology [21], Pathology [22, 23], Pediatrics [24], Psychiatry [25], Radiology [26], Surgery [27], as well as in medical education [28], specialty environments such as the Intensive Care Unit [29], and Dentistry [30].

Table 1

Impediments to awareness and understanding of cognitive biases in clinical judgment.

Variable: Effect
Clinical relevance: Medical undergraduates are not explicitly exposed to cognitive training in decision making. Historically, this area has not been seen as relevant to clinical performance and calibration.
Lack of awareness: Although the lay press has heavily promoted the impact of cognitive and affective biases on everyday decision making, clinicians are generally unaware of their potential impact on medical decision making.
Invulnerability: Even where awareness does exist, physician hubris, bias blind spot, overconfidence and lack of intellectual humility may deter them from accepting that they are just as vulnerable as others to impaired judgment through bias.
Myside bias: A tendency to evaluate and favor information that supports one’s preconceptions and beliefs. Also known as “one-sided bias,” it is a form of confirmatory bias that appears to gain strength as issues become more polarized.
Status quo bias: It is always easier for clinicians to continue to make decisions as they have done in the past. There is a prevailing tendency against learning debiasing strategies and executing them, as this requires considerably more cognitive effort and time.
Vivid-pallid dimension: Cognitive and affective processes are mostly invisible and, at present, can only be inferred from outcomes or the clinician’s behavior. Descriptions of them are invariably dry, pallid, abstract and uninteresting. They typically lack the vividness and concrete nature of clinical disease presentations that are far more meaningful and appealing to the medically trained mind.

A false dichotomy

In the early 1970s the publication of work by two psychologists, Tversky and Kahneman [31, 32], heralded the heuristics and biases approach to human judgment and decision making that challenged simplistic rational models. Since then a voluminous literature has emerged providing widespread support for the existence of many heuristics and biases in human decision making. Generally, but not exclusively, they exert their influence in the intuitive mode (System 1, or Type 1 processing), and cognitive psychologists have provided numerous experimental demonstrations of the flawed decision making that results from them [33–36]. Over the last couple of decades many books on the subject, often with a view towards correcting the effects of biases, have been published in the lay press; the topic has captured the public imagination.

Others have argued either directly against the existence of such biases [37], or that intuitive reasoning is the preferred mode of decision making [38–40]. Kahneman and Klein came to be seen as polar opposites on the issue and eventually attempted to resolve it through a joint publication [41], but both appear to have since reverted to their original positions [7, 42]. Polemics and a degree of rancor have arisen over this issue, much of which, in retrospect, appears not to have been the best investment of time. Some of the polarization appears to have arisen through misunderstandings of basic psychology theory, and a concomitant myside bias [41].

A dichotomy certainly exists in the sense that there are two distinct ways in which people arrive at decisions, but it is false to say that human decision making has to be one or the other. There are circumstances in which intuitive decision making is entirely appropriate – for the generation of ideas, when split-second decisions are required, or when non-quantifiable decisions need to be made around such issues as esthetics or creative endeavors. Discoveries and new ways of thinking about issues often arise from inspirations that are not derived analytically. In contrast, building a linear particle accelerator or staging a cancer depends upon purposeful, deliberate, scientific reasoning, and nothing else will do. The beauty of the human mind is that it can toggle between the two systems and select whichever mode is most appropriate to the task at hand; it is a dynamic process. In diagnosing illness, much of our initial contact will be with the intuitive mode, allowing various possibilities to come and go. One’s intuitions and biases will vary with experience which, as Smallberg nicely puts it, is “the thumb on the scale” when weighing the options [43]. But the proper selection and verification of the final diagnosis rests with the analytic system, if indeed we follow the prudent clinician and choose to use it. Even with as simple a diagnosis as constipation, several other possibilities need to be offered up to the analytic mind before prescribing a laxative.

Going forward

A little over a decade ago, talking about cognitive bias was rare despite some notable efforts to draw attention to its role in clinical reasoning [44]. Now, we seem to have arrived at a sufficient level of awareness to examine its impact on the process of making a diagnosis, and need to consider the next steps.

We need to know more about the psychological processes that underlie human decision making and, in particular, those involved in undoing bias [45–47]. Clinical medicine needs to fully embrace recent gains in cognitive psychology to develop effective techniques for debiasing, and to pursue collaborations with psychologists who specialize in decision making. A significant obstacle is that psychologists do not see patients, and clinicians are usually not trained in the research methods of psychology. Importantly, the two will need to work closely with each other to ensure the ecological validity of their research designs and the applicability of their findings [48].

In the meantime, we need to stay focused on how we train clinicians in decision making. Historically, medicine has put more emphasis on content knowledge than on how to think. There is now an added danger that modern technology may further reduce physicians’ cognitive calibration. However, some recent developments in medical education are encouraging, with an increasing emphasis on critical thinking (CT) [49] and on the influence of bias [51] and mindfulness [50] on clinical reasoning [52]. In education research, by far the most significant gains lie in CT interventions. A meta-analysis showed that such interventions in the age range 5–17 years had an overall positive effect size of 24 percentile points – the equivalent of moving a class from the 50th up to the 74th percentile in terms of their reasoning and problem-solving skills [53]. While medical undergraduates and postgraduates are outside this age range, they are still well within the period of rapid development of CT skills [54, 55]. Critical thinking improves generally during undergraduate education [56], and other studies have shown marked improvements with specific CT interventions in post-secondary learners [57–59].
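
To make the percentile arithmetic concrete, here is a minimal sketch, assuming normally distributed scores and an effect size expressed as a standardized mean difference (Cohen’s d); the percentile_gain helper and the implied d of roughly 0.64 are illustrative assumptions back-calculated from the figure quoted above, not values reported in the meta-analysis [53].

    # Illustrative sketch only: relates a standardized effect size (Cohen's d) to the
    # percentile-point gain quoted above, assuming normally distributed scores.
    # The implied d is back-calculated from the stated 50th -> 74th percentile shift;
    # it is not a figure taken from reference [53].
    from scipy.stats import norm

    def percentile_gain(d: float) -> float:
        """Percentile points gained by an average learner for effect size d."""
        return 100 * (norm.cdf(d) - 0.5)

    implied_d = norm.ppf(0.74)                  # about 0.64 standard deviations
    print(round(implied_d, 2))                  # 0.64
    print(round(percentile_gain(implied_d)))    # 24 percentile points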

We need to accept that cognitive and affective biases are a normal part of brain function. Being able to recognize and deal with bias is the mark of a critical thinker [60], but there is more to CT than that. It involves a variety of cognitive skills, such as developing accuracy, precision, relevance, depth, breadth and logic in one’s thinking, and an awareness of deception, propaganda and other distorting influences on our thinking [61]. Given that a significant proportion of diagnostic failure may be laid at the door of cognition, any initiative aimed at improving thinking skills in clinicians would appear worthwhile.


Corresponding author: Pat Croskerry, Dalhousie University – Critical Thinking Program, DME, 5849 University Avenue, PO Box 15000, Halifax, Nova Scotia, Canada, Phone: +1-902-494-4147, Fax: +1-902-494-2278, E-mail:

Conflict of interest statement: The author declares no conflict of interest.

References

1. Available from: http://en.wikipedia.org/wiki/Shamanism. Accessed on 8 September 2013.

2. Barnes PM, Bloom B, Nahin RL. Complementary and alternative medicine use among adults and children: United States, 2007. National health statistics reports; no 12. Hyattsville, MD: National Center for Health Statistics, 2008.

3. Freidson E. Client control and medical practice. Am J Sociol 1960;65:374–82. doi:10.1086/222726

4. Graber ML, Franklin N, Gordon R. Diagnostic error in internal medicine. Arch Intern Med 2005;165:1493–9. doi:10.1001/archinte.165.13.1493

5. Zwaan L, de Bruijne M, Wagner C, Thijs A, Smits M, van der Wal G, Timmermans DR. Patient record review of the incidence, consequences, and causes of diagnostic adverse events. Arch Intern Med 2010;170:1015–21. doi:10.1001/archinternmed.2010.146

6. Croskerry P, Abbass A, Wu A. Emotional issues in patient safety. J Patient Safety 2010;6:1–7. doi:10.1097/PTS.0b013e3181f6c01a

7. Kahneman D. Thinking, fast and slow. Toronto, ON: Doubleday, 2011.

8. Simon H. A behavioral model of rational choice. In: Models of man, social and rational: mathematical essays on rational human behavior in a social setting. New York: Wiley, 1957.

9. Lakoff G, Johnson M. Philosophy in the flesh: the embodied mind and its challenge to western thought. New York: Basic Books, 1999.

10. Available from: http://en.wikipedia.org/wiki/List_of_cognitive_biases. Accessed on 7 September 2013.

11. Jenicek M. Medical error and harm: understanding, prevention and control. New York: Productivity Press, 2011.

12. Dobelli R. The art of thinking clearly. New York: HarperCollins, 2013.

13. Pronin E, Lin DY, Ross L. The bias blind spot: perceptions of bias in self versus others. Pers Soc Psychol Bull 2002;28:369–81. doi:10.1177/0146167202286008

14. Stiegler MP, Neelankavil JP, Canales C, Dhillon A. Cognitive errors detected in anaesthesiology. Br J Anaesth 2012;108:229–35. doi:10.1093/bja/aer387

15. David CV, Chira S, Eells SJ, Ladrigan M, Papier A, Miller LG, Craft N, et al. Diagnostic accuracy in patients admitted to hospital with cellulitis. Dermatology Online J 2011;17:1. doi:10.5070/D39GN050RR

16. Croskerry P. Achieving quality in clinical decision making: cognitive strategies and detection of bias. Acad Emerg Med 2002;9:1184–204. doi:10.1197/aemj.9.11.1184

17. Croskerry P. The importance of cognitive errors in diagnosis and strategies to prevent them. Acad Med 2003;78:1–6. doi:10.1097/00001888-200308000-00003

18. Redelmeier D. The cognitive psychology of missed diagnoses. Ann Intern Med 2005;142:115–20. doi:10.7326/0003-4819-142-2-200501180-00010

19. Vickrey BG, Samuels MA, Ropper AH. How neurologists think: a cognitive psychology perspective on missed diagnoses. Ann Neurol 2010;67:425–33. doi:10.1002/ana.21907

20. Dunphy BC, Cantwell R, Bourke S, Fleming M, Smith B, Joseph KS, et al. Cognitive elements in clinical decision-making. Toward a cognitive model for medical education and understanding clinical reasoning. Adv Health Sci Educ 2010;15:229–50. doi:10.1007/s10459-009-9194-y

21. Margo CE. A pilot study in ophthalmology of inter-rater reliability in classifying diagnostic errors: an under investigated area of medical error. Qual Saf Health Care 2003;12:416–20. doi:10.1136/qhc.12.6.416

22. Foucar E. Error in anatomic pathology. Am J Clin Pathol 2001;116:S34–46. doi:10.1309/DDKV-E4YP-CJ5Q-3M4V

23. Crowley RS, Legowski E, Medvedeva O, Reitmeyer K, Tseytlin E, Castine M, Jukic D, Mello-Thomas C. Automated detection of heuristics and biases among pathologists in a computer-based system. Adv Health Sci Educ 2013;18:343–63. doi:10.1007/s10459-012-9374-z

24. Singh H, Thomas EJ, Wilson L, Kelly PA, Pietz K, Elkeeb D, Singhal G. Errors of diagnosis in pediatric practice: a multisite survey. Pediatrics 2010;126:70–9. doi:10.1542/peds.2009-3218

25. Crumlish N, Kelly BD. How psychiatrists think. Adv Psychiatr Treat 2009;15:72–9. doi:10.1192/apt.bp.107.005298

26. Sabih D, Sabih A, Sabih Q, Khan AN. Image perception and interpretation of abnormalities; can we believe our eyes? Can we do something about it? Insights Imaging 2011;2:47–55. doi:10.1007/s13244-010-0048-1

27. Shiralkar U. Smart surgeons, sharp decisions: cognitive skills to avoid errors and achieve results. Shropshire, United Kingdom: TFM Publishing, 2010.

28. Hershberger PJ, Markert RJ, Part HM, Cohen SM, Finger WW. Understanding and addressing cognitive bias in medical education. Adv Health Sci Educ 1997;1:221–6. doi:10.1023/A:1018372327745

29. Gillon SA, Radford ST. Zebra in the intensive care unit: a metacognitive reflection on misdiagnosis. Crit Care Resusc 2012;14:216–21.

30. Hicks EP, Kluemper GT. Heuristic reasoning and cognitive biases: are they hindrances to judgments and decision making in orthodontics? Am J Orthod Dentofacial Orthop 2011;139:297–304. doi:10.1016/j.ajodo.2010.05.018

31. Tversky A, Kahneman D. Belief in the law of small numbers. Psychol Bull 1971;76:105–10. doi:10.1037/h0031322

32. Tversky A, Kahneman D. Judgment under uncertainty: heuristics and biases. Science 1974;185:1124–31. doi:10.1126/science.185.4157.1124

33. Kahneman D, Slovic P, Tversky A. Judgment under uncertainty: heuristics and biases. Cambridge, UK: Cambridge University Press, 1982. doi:10.1017/CBO9780511809477

34. Plous S. The psychology of judgment and decision making. New York: McGraw-Hill, 1993.

35. Baron J. Thinking and deciding, 3rd edn. New York: Cambridge University Press, 2000.

36. Gilovich T, Griffin D, Kahneman D. Heuristics and biases: the psychology of intuitive judgment. New York: Cambridge University Press, 2002. doi:10.1017/CBO9780511808098

37. Klein G, Orasanu J, Calderwood R, Zsambok CE. Decision making in action: models and methods. Norwood, NJ: Ablex Publishing Co., 1993.

38. Klein G. The power of intuition. New York: Doubleday, 2004.

39. Gladwell M. Blink: the power of thinking without thinking. New York: Little, Brown and Company, 2005.

40. Gigerenzer G. Gut feelings: the intelligence of the unconscious. New York: Viking Penguin, 2007.

41. Kahneman D, Klein G. Conditions for intuitive expertise: a failure to disagree. Am Psychol 2009;64:515–26. doi:10.1037/a0016755

42. Klein G. What physicians can learn from firefighters. Keynote presentation at the Diagnostic Error in Medicine (DEM) Annual Conference, October 23–26, 2011, Chicago, Illinois, US.

43. Smallberg G. Bias is the nose for the story. In: Brockman J, editor. This will make you smarter. New York: Harper Perennial, 2012:43–5.

44. Perkins DN. Postprimary education has little impact on informal reasoning. J Educ Psychol 1985;77:562–71. doi:10.1037/0022-0663.77.5.562

45. Graber ML, Kissam S, Payne VL, Meyer AN, Sorensen A, Lenfestey N, et al. Cognitive interventions to reduce diagnostic error: a narrative review. BMJ Qual Saf 2012;21:535–57. doi:10.1136/bmjqs-2011-000149

46. Croskerry P, Singhal G, Mamede S. Cognitive debiasing 1: origins of bias and theory of debiasing. BMJ Qual Saf 2013;22:ii58–64. doi:10.1136/bmjqs-2012-001712

47. Croskerry P, Singhal G, Mamede S. Cognitive debiasing 2: impediments to and strategies for change. BMJ Qual Saf 2013;22:ii65–72. doi:10.1136/bmjqs-2012-001713

48. Croskerry P, Petrie D, Reilly M, Tait G. Deciding about fast and slow decisions. Acad Med (accepted for publication).

49. Novella S. Your deceptive mind: a scientific guide to critical thinking skills. Chantilly, VA: The Great Courses, 2012.

50. Epstein R. Mindful practice. J Am Med Assoc 1999;282:833–9. doi:10.1001/jama.282.9.833

51. Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimise them. Acad Med 2003;78:775–80. doi:10.1097/00001888-200308000-00003

52. Gay S, Bartlett M, McKinley R. Teaching clinical reasoning to medical students. The Clinical Teacher 2013;10:308–12. doi:10.1111/tct.12043

53. Higgins S, Hall E, Baumfield V, Moseley D. A meta-analysis of the impact of the implementation of thinking skills approaches on pupils. In: Research evidence in education library. London: EPPI-Centre, Social Science Research Unit, Institute of Education, University of London, 2005.

54. Friend CM, Zubek JP. The effects of age on critical thinking ability. J Gerontol 1958;13:407–13. doi:10.1093/geronj/13.4.407

55. Denney NW. Critical thinking during the adult years: has the developmental function changed over the last four decades? Exp Aging Res 1995;21:191–207. doi:10.1080/03610739508254277

56. Lehman DR, Nisbett RE. A longitudinal study of the effects of undergraduate training on reasoning. Dev Psychol 1990;26:952–60. doi:10.1037/0012-1649.26.6.952

57. Solon T. Generic critical thinking infusion and course content learning in introductory psychology. J Instructional Psychol 2007;34:95–109.

58. Bensley D, Crowe DS, Bernhardt P, Buckner C, Allman AL. Teaching and assessing critical thinking skills for argument analysis in psychology. Teach Psychol 2010;37:91–6. doi:10.1080/00986281003626656

59. Butler HA, Dwyer CP, Hogan MJ, Franco A, Rivas SF, Saiz C, Almeida LS, et al. The Halpern Critical Thinking Assessment and real-world outcomes: cross-national applications. Thinking Skills and Creativity 2012;7:112–21. doi:10.1016/j.tsc.2012.04.001

60. West RF, Toplak ME, Stanovich KE. Heuristics and biases as measures of critical thinking: associations with cognitive ability and thinking dispositions. J Educ Psychol 2008;100:930–41. doi:10.1037/a0012842

61. Elder L, Paul R. Critical thinking development: a stage theory with implications for instruction. Tomales, CA: Foundation for Critical Thinking, 2010.

Received: 2013-9-20
Accepted: 2013-10-16
Published Online: 2014-01-08
Published in Print: 2014-01-01

©2014 by Walter de Gruyter Berlin/Boston

This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 3.0 License.
