
How We Use Abstract Thinking

Kendra Cherry, MS, is a psychosocial rehabilitation specialist, psychology educator, and author of the "Everything Psychology Book."



Abstract thinking, also known as abstract reasoning, involves the ability to understand and think about complex concepts that, while real, are not tied to concrete experiences, objects, people, or situations.

Abstract thinking is considered a type of higher-order thinking, usually about ideas and principles that are often symbolic or hypothetical. This type of thinking is more complex than the type of thinking that is centered on memorizing and recalling information and facts.

Examples of Abstract Thinking

Examples of abstract concepts include ideas such as:

  • Imagination
  • Justice
  • Freedom
  • Love
  • Compassion

While these things are real, they aren't concrete, physical things that people can experience directly via their traditional senses.

You likely encounter examples of abstract thinking every day. Stand-up comedians use abstract thinking when they observe absurd or illogical behavior in our world and come up with theories as to why people act the way they do.

You use abstract thinking when you're in a philosophy class or when you're contemplating what would be the most ethical way to conduct your business. If you write a poem or an essay, you're also using abstract thinking.

With all of these examples, concepts that are theoretical and intangible are being translated into a joke, a decision, or a piece of art. (You'll notice that creativity and abstract thinking go hand in hand.)

Abstract Thinking vs. Concrete Thinking

One way of understanding abstract thinking is to compare it with concrete thinking. Concrete thinking, also called concrete reasoning, is tied to specific experiences or objects that can be observed directly.

Research suggests that concrete thinkers tend to focus more on the procedures involved in how a task should be performed, while abstract thinkers are more focused on the reasons why a task should be performed.

It is important to remember that you need both concrete and abstract thinking skills to solve problems in day-to-day life. In many cases, you utilize aspects of both types of thinking to come up with solutions.

Other Types of Thinking

Depending on the type of problem we face, we draw from a number of different styles of thinking, such as:

  • Creative thinking : This involves coming up with new ideas, or using existing ideas or objects to come up with a solution or create something new.
  • Convergent thinking : Often called linear thinking, this is when a person follows a logical set of steps to select the best solution from already-formulated ideas.
  • Critical thinking : This is a type of thinking in which a person tests solutions and analyzes any potential drawbacks.
  • Divergent thinking : Often called lateral thinking, this style involves using new thoughts or ideas that are outside of the norm in order to solve problems.

How Abstract Thinking Develops

While abstract thinking is an essential skill, it isn’t something that people are born with. Instead, this cognitive ability develops throughout the course of childhood as children gain new abilities, knowledge, and experiences.

The psychologist Jean Piaget described a theory of cognitive development that outlined this process from birth through adolescence and early adulthood. According to his theory, children go through four distinct stages of intellectual development:

  • Sensorimotor stage : During this early period, children's knowledge is derived primarily from their senses.
  • Preoperational stage : At this point, children develop the ability to think symbolically.
  • Concrete operational stage : At this stage, kids become more logical but their understanding of the world tends to be very concrete.
  • Formal operational stage : The ability to reason about concrete information continues to grow during this period, but abstract thinking skills also emerge.

This period of cognitive development when abstract thinking becomes more apparent typically begins around age 12. It is at this age that children become more skilled at thinking about things from the perspective of another person. They are also better able to mentally manipulate abstract ideas as well as notice patterns and relationships between these concepts.

Uses of Abstract Thinking

Abstract thinking is a skill that is essential for the ability to think critically and solve problems. This type of thinking is also related to what is known as fluid intelligence, or the ability to reason and solve problems in unique ways.

Fluid intelligence involves thinking abstractly about problems without relying solely on existing knowledge.

Abstract thinking is used in a number of ways in different aspects of your daily life. Some examples of times you might use this type of thinking:

  • When you describe something with a metaphor
  • When you talk about something figuratively
  • When you come up with creative solutions to a problem
  • When you analyze a situation
  • When you notice relationships or patterns
  • When you form a theory about why something happens
  • When you think about a problem from another point of view

Research also suggests that abstract thinking plays a role in the actions people take. Abstract thinkers have been found to be more likely to engage in risky behaviors, while concrete thinkers are more likely to avoid risks.

Impact of Abstract Thinking

People who have strong abstract thinking skills tend to score well on intelligence tests. Because this type of thinking is associated with creativity, abstract thinkers also tend to excel in areas that require creativity such as art, writing, and other areas that benefit from divergent thinking abilities.

Abstract thinking can have both positive and negative effects. It can be used as a tool to promote innovative problem-solving, but it can also lead to problems in some cases:

  • Bias : Research suggests that abstract thinking can sometimes promote certain types of bias. As people seek to understand events, abstract thinking can lead them to perceive patterns, themes, and relationships that may not actually exist.
  • Catastrophic thinking : Sometimes these inferences, imagined scenarios, and predictions about the future can lead to feelings of fear and anxiety. Instead of making realistic predictions, people may catastrophize and imagine the worst possible potential outcomes.
  • Anxiety and depression : Research has also found that abstract thinking styles are sometimes associated with worry and rumination. This thinking style is also associated with a range of conditions including depression, anxiety, and post-traumatic stress disorder (PTSD).

Conditions That Impact Abstract Thinking

The presence of learning disabilities and mental health conditions can affect abstract thinking abilities. Conditions that are linked to impaired abstract thinking skills include:

  • Learning disabilities
  • Schizophrenia
  • Traumatic brain injury (TBI)

The natural aging process can also have an impact on abstract thinking skills. Research suggests that the thinking skills associated with fluid intelligence peak around the ages of 30 or 40 and begin to decline with age.

Tips for Reasoning Abstractly

While some psychologists believe that abstract thinking skills are a natural product of normal development, others suggest that these abilities are influenced by genetics, culture, and experiences. Some people may come by these skills naturally, but you can also strengthen these abilities with practice.

Some strategies that you might use to help improve your abstract thinking skills:

  • Think about why and not just how : Abstract thinkers tend to focus on the meaning of events or on hypothetical outcomes. Instead of concentrating only on the steps needed to achieve a goal, consider some of the reasons why that goal might be valuable or what might happen if you reach that goal.
  • Reframe your thinking : When you are approaching a problem, it can be helpful to purposefully try to think about the problem in a different way. How might someone else approach it? Is there an easier way to accomplish the same thing? Are there any elements you haven't considered?
  • Consider the big picture : Rather than focusing on the specifics of a situation, try taking a step back in order to view the big picture. Where concrete thinkers are more likely to concentrate on the details, abstract thinkers focus on how something relates to other things or how it fits into the grand scheme of things.

Abstract thinking allows people to think about complex relationships, recognize patterns, solve problems, and utilize creativity. While some people tend to be naturally better at this type of reasoning, it is a skill that you can learn to utilize and strengthen with practice. 

It is important to remember that both concrete and abstract thinking are skills that you need to solve problems and function successfully. 



ABLE blog: thoughts, learnings and experiences


What is abstract thinking? 10 activities to improve your abstract thinking skills


Have you ever been in a meeting and proposed a unique solution to a problem? Or have you ever been faced with a difficult decision and thought about the potential consequences before making your choice?

These are examples of abstract thinking in action. Everyone uses abstract thinking in day-to-day life, but you may be wondering — what is abstract thinking?

Abstract thinking is the ability to comprehend ideas that aren't tangible or concrete. It's a crucial skill for problem-solving, creativity, and critical thinking — and the best part is that it can be developed and strengthened with practice.

In this article, we'll explore the concept of abstract thinking and offer some simple ways to become a stronger abstract thinker in everyday life. With some practice, you can become an expert problem-solver and use conceptual thinking to your advantage.

What is abstract thinking?


Abstract thinking is a cognitive process that allows us to think beyond observable information and deal with concepts, ideas, theories, and principles. By thinking outside of our existing knowledge, we can come up with solutions that aren't immediately obvious. This type of thinking is essential for problem-solving, decision-making, and critical thinking .

Abstract thinking enables us to generate new ideas, connect unrelated concepts, and look at the bigger picture. It also involves contemplating sentiments such as love, freedom, and compassion. These concepts aren’t concrete and can have different interpretations. By using abstract thinking, we can gain a deeper understanding of these concepts and their different meanings.

Abstract thinking is also crucial to creativity, innovation, and advanced problem-solving. It allows us to think beyond the surface level of a problem and come up with unique solutions. This can be especially important in fields such as science and technology, where new breakthroughs often require fresh perspectives and innovative thinking.

In addition, abstract thinking is a vital skill for personal development, enabling us to think beyond our immediate environment and beliefs and consider different perspectives. This allows individuals to make better decisions, be more receptive and open to change, and be more creative.


Abstract vs. concrete thinking

We can best understand abstract thinking by knowing what it's not — concrete thinking. Concrete thinking is understanding and processing observable and directly experienced information. It's often associated with basic sensory and perceptual processes, such as recognizing a familiar face or identifying a physical object by its shape.

On the other hand, abstract thinking is the ability to understand and process information that isn’t directly observable or experienced. Abstract thinking is often associated with higher-level cognitive processes, such as decision making and critical thinking.

For example, if you’re asked what a chair looks like, concrete thinking would involve picturing it and what it's typically used for. By contrast, abstract thinking would involve considering what a chair could symbolize or how it could be used differently than what is traditionally accepted.

The two types of thinking aren’t mutually exclusive — instead, they complement each other in the cognitive process. We need both concrete and abstract thinking skills to effectively process information and make informed decisions.

How is abstract thinking developed?


Abstract thinking is a cognitive process that develops over time, beginning in childhood and continuing into adulthood. The psychologist Jean Piaget , known for his theory of cognitive development, proposed that children go through different stages of mental growth. This begins with the sensorimotor stage, in which infants and young children learn through their senses and motor skills and develop concrete thinking skills. In their later years, they develop more advanced cognitive abilities, including abstract thinking.

During childhood, abstract thinking develops as children use the cognitive approach to learning to grasp new concepts and skills. They start to understand and manipulate abstract concepts such as numbers, time, and cause and effect. As they observe the world around them, they use what they know to make sense of what is happening and explore other possibilities.

A learning disability, mental health condition, or brain injury can, however, affect abstract thinking. Among these are psychiatric conditions like schizophrenia, developmental conditions like autism, ADHD, and dyslexia, and neurological conditions like stroke, dementia, and traumatic brain injury. Affected individuals may have difficulty understanding and manipulating abstract concepts and may need additional support to develop their abstract thinking skills.

As adults, we continue to refine our abstract thinking skills through practice. We can become adept at problem-solving and critical thinking by regularly engaging in activities that require abstract thought. These activities include brainstorming, reading, writing, playing board games, and exploring creative projects. Factors such as experience, education, and environment all play a role in the development of abstract thinking, and it's essential to continue challenging and exercising our cognitive learning skills to maintain and improve abstract thinking.

Why is it important to learn to think abstractly?

Thinking abstractly is a crucial skill that allows us to go beyond surface-level understanding and interpret the deeper meaning of concepts, ideas, and information. It enables us to see the big picture and make connections between seemingly unrelated ideas, which is a crucial thinking tool for problem solving and critical thinking. Additionally, learning to think abstractly can bring numerous benefits in our daily lives and in various fields such as science, technology, engineering, and mathematics (STEM).

For instance, abstract thinking enables us to process information quickly and efficiently on a daily basis. It helps us understand and interpret what people are saying and what is happening around us, which can lead to better decision-making. Abstract thinking is vital in STEM fields for innovation and progress, as it encourages creative thinking and the exploration of new ideas and perspectives.

Furthermore, abstract thinking helps us understand abstract concepts such as justice, freedom, and patriotism. By using analogies and other tools, we can consider what these words stand for, their implications in our world, and how they can be applied effectively in day-to-day life. In this way, abstract thinking helps us make sense of complex ideas and concepts and enables us to navigate the world with greater insight and understanding.

10 tips to improve your abstract thinking skills


Abstract thinking is crucial for problem-solving, creativity, and critical thinking. Fortunately, there are many ways to improve these skills in your everyday life.

1. Incorporate puzzles into your life

Solving puzzles is a great way to practice abstract reasoning and exercise your brain. Whether you enjoy crosswords, Sudoku, or jigsaw puzzles, solving these types of problems improves your ability to think abstractly by requiring you to think critically and strategically to find solutions to issues that aren’t immediately obvious.

2. Learn something new

Your mind engages in the information processing cycle when learning new things. Learning something new allows you to explore different perspectives and understand how the world works. You'll gain new knowledge and practice your abstract thinking skills as you process, store, and recall what you’ve learned.

3. Explore your creativity

Creative expression is another excellent way to exercise your abstract thinking skills. Creativity is popularly associated with the brain's right hemisphere and with abstract, open-ended problem-solving. Through drawing, painting, writing, or photography, exploring the creative process encourages you to think outside the box and develop new ideas.

4. Practice mindfulness

Mindfulness is the practice of purposely observing the present moment without judgment or bias. Practicing mindfulness can help you improve your abstract thinking by teaching you how to observe your thoughts, feelings, and emotions objectively and without judgment. As you think more deeply and analytically about what's happening in the present moment, you will further develop your abstract thinking skills.

5. Make a habit of reading


Books and articles on various topics can help you build your understanding of complex concepts and ideas. Reading enables you to develop your ability to connect different ideas and think critically about the material. You also have to use your imagination to visualize what you're reading, which helps to improve your creative thinking abilities. Annotating your reading can step this up a notch.

6. Travel somewhere new

Traveling to new places exposes you to new cultures and ways of thinking, which can help to expand your mind and improve your abstract thinking skills. Plus, when you're in a new place, you're forced to think on your feet as you figure out how to navigate the unfamiliar landscape. This helps to build up your problem-solving skills, which are essential for developing abstract thinking abilities.

7. Get more exercise

Exercise is not only beneficial for your physical health, but it can also be beneficial for your mental health . Exercise helps to increase oxygen flow to the brain, which can improve cognitive functioning and help you think more clearly. Exercise also increases the production of endorphins, which can improve your mood and make it easier to focus on what you're doing.

8. Practice critical thinking

Critical thinking involves using your reasoning skills to evaluate information objectively. By practicing critical thinking, you can develop your abstract thinking ability by learning to analyze information, identify patterns and connections, and draw logical conclusions. Additionally, critical thinking will help you become more aware of your own biases so that you can make unbiased decisions.

9. Embrace risk-taking

Taking risks and engaging in activities that make you uncomfortable can help you practice abstract thinking. Stepping outside of your comfort zone forces you to think differently and create solutions to complex problems. It also requires you to push yourself beyond what is familiar and take a leap of faith as you learn new things .

10. Take up a new hobby

Hobbies like painting, sculpting, and photography can help you practice abstract thinking by allowing you to explore new ideas and ways of looking at the world. These activities also require you to use your imagination and creativity to devise solutions that aren’t immediately obvious. It also makes you feel accomplished when you're done, which can boost your confidence and make you more open to taking risks in other aspects of life.

Enhance your abstract thinking skills

If you've wondered, "What is abstract thinking?" now you have a better understanding. Abstract thinking skills can benefit us in many areas. From problem solving to meaningful learning to critical thinking, it's a powerful tool that can enhance our ability to navigate daily challenges.

By incorporating activities that promote the abstract thinking process into our daily routine, we can improve our ability to grasp abstract ideas, improve our decision-making skills, and see the bigger picture. With practice and dedication, we can master the art of abstract thinking and unlock its full potential.


Erin E. Rupp



Psychologily


What is Abstract Thinking? Understanding the Power of Creative Thought

When we think about thinking, we usually imagine it as a straightforward process of weighing options and making decisions. However, there is a more complex type of thought: abstract thinking. Abstract thinking involves understanding and thinking about complex concepts that are not tied to concrete experiences, objects, people, or situations.

Abstract thinking is a type of higher-order thinking that usually deals with ideas and principles that are often symbolic or hypothetical. It is the ability to think about things that are not physically present and to look at the broader significance of ideas and information rather than the concrete details. Abstract thinkers are interested in the deeper meaning of things and the bigger picture. They can see patterns and connections between seemingly unrelated concepts and ideas. For example, when we listen to a piece of music, we may feel a range of emotions that are not directly related to the lyrics or melody. Abstract thinkers can understand and appreciate the complex interplay of elements that create this emotional response.

Understanding Abstract Thinking

Humans can think about concepts and ideas that are not physically present. This is known as abstract thinking. It is a type of higher-order thinking that involves processing often symbolic or hypothetical information.

Defining Abstract Thinking

Abstract thinking is a cognitive skill that allows us to understand complex ideas, make connections between seemingly unrelated concepts, and solve problems creatively. It is a way of thinking not tied to specific examples or situations. Instead, it involves thinking about the broader significance of ideas and information.

Abstract thinking differs from concrete thinking, which is tied to specific experiences and objects that can be observed directly. Concrete thinking is vital for understanding the world, but abstract thinking is essential for problem-solving, creativity, and critical thinking.

Origins of Abstract Thinking

The origins of abstract thinking are not fully understood, but it is believed to be a uniquely human ability. Some researchers believe that abstract thinking results from the development of language and symbolic thought. Others believe that it results from our ability to imagine and visualize concepts and ideas.

Abstract thinking is an essential skill that can be developed and strengthened with practice regardless of its origins. By learning to think abstractly, we can expand our understanding of the world and develop new solutions to complex problems.

Abstract thinking is a higher-order cognitive skill that allows us to think about concepts and ideas that are not physically present. We can improve our problem-solving, creativity, and critical thinking skills by developing our abstract thinking ability.

Importance of Abstract Thinking

Abstract thinking is a crucial skill that significantly impacts our daily lives. It allows us to understand complex concepts and think beyond what we see or touch. This section will discuss the benefits of abstract thinking in our daily lives and its role in problem-solving.

Benefits in Daily Life

Abstract thinking is essential for our personal growth and development. It enables us to think critically and creatively, which is necessary for making informed decisions. When we think abstractly, we can understand complex ideas and concepts, which helps us communicate more effectively with others.

Abstract thinking also helps us to be more adaptable and flexible in different situations. We can see things from different perspectives and find innovative solutions to problems. This skill is beneficial in today’s fast-paced world, where change is constant, and we need to adapt quickly.

Role in Problem Solving

Abstract thinking plays a crucial role in problem-solving. It allows us to approach problems from different angles and find creative solutions. When we can think abstractly, we can see the bigger picture and understand the underlying causes of a problem.

By using abstract thinking, we can also identify patterns and connections that may not be immediately apparent. This helps us to find solutions that are not only effective but also efficient. For example, a business owner who can think abstractly can identify the root cause of a problem and develop a solution that addresses it rather than just treating the symptoms.

Abstract thinking is a valuable skill with many benefits in our daily lives. It allows us to think critically and creatively, be more adaptable and flexible, and find innovative solutions to problems. By developing our abstract thinking skills, we can improve our personal and professional lives and positively impact the world around us.

Abstract Thinking vs. Concrete Thinking

When it comes to thinking, we all have different approaches. Some of us tend to think more abstractly, while others tend to think more concretely. Abstract thinking and concrete thinking are two different styles of thought that can influence how we perceive and interact with the world around us.

Key Differences

The key difference between abstract and concrete thinking is the level of specificity involved in each style. Concrete thinking focuses on a situation’s immediate and tangible aspects, whereas abstract thinking is more concerned with the big picture and underlying concepts.

Concrete thinking is often associated with literal interpretations of information, while abstract thinking relates to symbolic and metaphorical interpretations. For example, if we describe a tree, someone who thinks concretely might describe its physical appearance and characteristics. In contrast, someone who thinks abstractly might explain its symbolic significance in nature.

The Transition From Concrete to Abstract Thinking

While some people may naturally lean towards one style of thinking over the other, it is possible to transition from concrete to abstract thinking. This can be particularly useful in problem-solving and critical-thinking situations, where a more abstract approach may be needed to find a solution.

One way to make this transition is to focus on a situation’s underlying concepts and principles rather than just the immediate details. This can involve asking questions that explore the broader implications of a situation or looking for patterns and connections between seemingly unrelated pieces of information.

Abstract and concrete thinking are two different styles of thought that can influence how we perceive and interact with the world around us. While both styles have their strengths and weaknesses, transitioning between them can be valuable in many areas of life.

Development of Abstract Thinking

As we grow and learn, our ability to think abstractly develops. Age and education are two major factors that influence the development of abstract thinking.

Influence of Age

As we age, our ability to think abstractly improves. This is due to the development of our brain and cognitive abilities. According to Piaget’s theory of cognitive development , children progress through four stages of cognitive development, with the final stage being the formal operational stage. This stage is characterized by the ability to think abstractly and logically about hypothetical situations and concepts.

Role of Education

Education also plays a significant role in the development of abstract thinking. Through education, we are exposed to new ideas, concepts, and theories that challenge our existing knowledge and encourage us to think abstractly. Education also gives us the tools and skills to analyze and evaluate complex information and ideas.

In addition to traditional education, engaging in activities that promote abstract thinking can be beneficial. For example, participating in debates, solving puzzles, and playing strategy games can all help improve our abstract thinking skills.

The development of abstract thinking is a complex process influenced by age and education. By continually challenging ourselves to think abstractly and engaging in activities that promote abstract thinking, we can continue to improve our cognitive abilities and expand our knowledge and understanding of the world around us.

Challenges in Abstract Thinking

Abstract thinking can be a challenging cognitive process, especially for those not used to it. Here are some common misunderstandings and difficulties people may encounter when thinking abstractly.

Common Misunderstandings

One common misunderstanding about abstract thinking is that it is the same as creative thinking. While creativity can certainly involve abstract thinking, the two are not interchangeable. Abstract thinking consists of understanding and thinking about complex concepts not tied to concrete experiences, objects, people, or situations. Creative thinking, on the other hand, involves coming up with new and innovative ideas.

Another common misunderstanding is that abstract thinking is only helpful for people in certain fields, such as science or philosophy. Abstract thinking can benefit many different areas of life, from problem-solving at work to understanding complex social issues.

Overcoming Difficulties

One difficulty people may encounter when thinking abstractly is a lack of concrete examples or experiences to draw from. To overcome this, finding real-world examples of the concepts you are trying to understand can be helpful. For example, if you are trying to understand the concept of justice, you might look for examples of situations where justice was served or not served.

Another challenge people may encounter is focusing too much on details and not enough on the bigger picture. To overcome this, try to step back and look at the broader significance of the ideas and information you are working with. This can involve asking yourself questions like “What is the main point here?” or “How does this fit into the larger context?”

Abstract thinking can be a challenging but valuable cognitive process. By understanding common misunderstandings and overcoming difficulties, we can develop our ability to think abstractly and apply it in various aspects of our lives.

Frequently Asked Questions

How does abstract thinking differ from concrete thinking?

Abstract thinking is a type of thinking that involves the ability to think about concepts, ideas, and principles that are not necessarily tied to physical objects or experiences. Concrete thinking, on the other hand, is focused on the here and now, and is more concerned with the physical world and immediate experiences.

What are some examples of abstract thinking?

Examples of abstract thinking include the ability to understand complex ideas, to think creatively, to solve problems, to think critically, and to engage in philosophical discussions.

What is the significance of abstract thinking in psychiatry?

Abstract thinking is an important component of mental health and well-being. It allows individuals to think beyond the present moment and to consider different possibilities and outcomes. In psychiatry, the ability to engage in abstract thinking is often used as an indicator of cognitive functioning and overall mental health.

At what age does abstract thinking typically develop?

Abstract thinking typically develops during adolescence, around the age of 12 or 13. However, the ability to engage in abstract thinking can continue to develop throughout adulthood, with continued practice and exposure to new ideas and experiences.

What are the stages of abstract thought according to Piaget?

According to Piaget, children progress through four stages of cognitive development: the sensorimotor stage (birth to 2 years), the preoperational stage (2 to 7 years), the concrete operational stage (7 to 12 years), and the formal operational stage (12 years and up). Abstract thought emerges during the formal operational stage, when individuals become able to reason about hypothetical situations and possibilities.

What are some exercises to improve abstract thinking skills?

Some exercises that can help improve abstract thinking skills include engaging in philosophical discussions, solving puzzles and brain teasers, playing strategy games, and engaging in creative activities such as writing or painting. Additionally, exposing oneself to new ideas and experiences can help broaden one’s perspective and improve abstract thinking abilities.


Abstract Reasoning

Abstract Reasoning: Understanding the Essence of Cognitive Ability

Abstract Reasoning is a fundamental cognitive ability that plays a crucial role in problem-solving, decision-making, and logical thinking. It is the mental capability to analyze and comprehend abstract patterns, relationships, and concepts, without the need for any prior specific knowledge. This cognitive skill enables individuals to think critically, make connections, and draw inferences from complex information.

At its core, Abstract Reasoning involves the ability to identify and interpret patterns, similarities, and differences within non-verbal or symbolic data sets. It requires individuals to go beyond concrete information and grasp the underlying logic or principles that govern a given scenario. Unlike other cognitive abilities that rely heavily on factual or acquired knowledge, Abstract Reasoning focuses on intuitive and creative thinking.

Abstract Reasoning assessments often present candidates with sequences of shapes, symbols, or patterns. By analyzing these abstract elements, individuals are required to perceive the inherent relationships and make predictions based on the logic governing the sequence. This ability to discern patterns and relationships allows individuals to solve problems in novel and unfamiliar situations.
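To make the mechanics of such items concrete in programming terms, here is a minimal, hypothetical sketch of the kind of rule inference these assessments exercise: given a short sequence, detect a constant-difference or constant-ratio rule and predict the next element. The function name and the example sequences are illustrative, not items from any actual assessment.

```python
def predict_next(seq):
    """Infer a simple rule from a sequence and predict the next element.

    Checks two candidate rules, in order:
    - arithmetic: a constant difference between consecutive terms
    - geometric: a constant ratio between consecutive terms
    Returns None if neither rule fits.
    """
    diffs = [b - a for a, b in zip(seq, seq[1:])]
    if len(set(diffs)) == 1:                      # arithmetic rule found
        return seq[-1] + diffs[0]
    if all(x != 0 for x in seq):
        ratios = [b / a for a, b in zip(seq, seq[1:])]
        if len(set(ratios)) == 1:                 # geometric rule found
            return seq[-1] * ratios[0]
    return None                                   # no simple rule detected

print(predict_next([2, 5, 8, 11]))   # arithmetic, step 3 -> 14
print(predict_next([3, 6, 12, 24]))  # geometric, ratio 2 -> 48.0
```

The same idea of "find the governing rule, then extrapolate" underlies the shape and symbol sequences used in real tests; the abstract element is that nothing about the specific numbers matters, only the relationship between them.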

Mastering Abstract Reasoning is crucial in various fields, such as mathematics, science, engineering, and computer programming. It empowers individuals to solve complex problems, devise innovative solutions, and make informed decisions based on logical deductions.

Why Assess a Candidate's Abstract Reasoning Skill Level?

Assessing a candidate's Abstract Reasoning skill level is essential for organizations looking to hire top talent and create high-performing teams. Here are several key reasons why evaluating Abstract Reasoning abilities during the hiring process is crucial:

Problem-Solving Aptitude : Abstract Reasoning provides valuable insights into a candidate's problem-solving abilities. By evaluating their proficiency in recognizing and understanding complex patterns, organizations can gauge an individual's capacity to approach and solve intricate problems creatively and efficiently.

Critical Thinking : Abstract Reasoning is closely tied to critical thinking skills. Candidates with strong Abstract Reasoning skills demonstrate the ability to analyze information objectively, make logical connections, and draw accurate conclusions. Assessing this cognitive ability helps identify individuals who can think critically and make sound judgments.

Adaptability and Learning Potential : Abstract Reasoning reflects an individual's ability to adapt and learn new concepts quickly. Candidates who excel in this cognitive skill exhibit a high capacity for abstract thinking, which enables them to grasp new information and adapt to changing circumstances effectively. Assessing Abstract Reasoning helps identify candidates who have the potential to learn and adapt in dynamic work environments.

Innovation and Creativity : Abstract Reasoning is closely associated with innovation and creativity. Candidates who possess strong Abstract Reasoning abilities are more likely to come up with innovative solutions, think outside the box, and generate fresh ideas. Assessing this skill can help identify individuals who can contribute to a company's innovation and creativity initiatives.

Analytical Decision-Making : Abstract Reasoning is instrumental in making analytical decisions based on incomplete or ambiguous information. By assessing a candidate's Abstract Reasoning skill level, organizations can identify individuals who can make informed decisions in complex and ambiguous situations, reducing the risk of hasty or impulsive decision-making.

Assessing a candidate's Abstract Reasoning skill level is crucial to building a capable and versatile workforce. By evaluating this cognitive ability, organizations can identify individuals with strong problem-solving skills, critical thinking abilities, adaptability, innovation potential, and analytical decision-making capabilities. A comprehensive assessment of Abstract Reasoning helps organizations identify top talent and build teams capable of driving success and growth.

Assessing a Candidate's Abstract Reasoning Skill Level with Alooba

When it comes to evaluating a candidate's Abstract Reasoning skill level, Alooba provides a comprehensive and efficient solution for companies seeking top talent. With our advanced assessment platform, you can accurately measure and assess candidates' Abstract Reasoning abilities. Here's how Alooba can help:

Tailored Abstract Reasoning Assessments : Alooba offers specialized Abstract Reasoning assessments that are specifically designed to measure a candidate's proficiency in this cognitive ability. Our assessments present candidates with abstract patterns and sequences, allowing them to showcase their skills in identifying relationships, discerning patterns, and making logical inferences.

Diverse Test Formats : Alooba provides a range of test formats that suit different recruitment needs. Whether it's multiple-choice tests, data analysis exercises, or coding tasks, our platform allows you to select the most relevant test format to evaluate a candidate's Abstract Reasoning skills effectively.

Customizable Assessments : With Alooba, you have the flexibility to customize assessments according to your specific requirements. Tailor the assessments to align with your industry, job role, or organizational needs, ensuring that you evaluate Abstract Reasoning skills in a way that is most relevant to your hiring process.

Objective Evaluation Process : Alooba's assessments are autograded, ensuring an objective evaluation of candidates' Abstract Reasoning skills. This eliminates human bias and provides consistent results, allowing you to make fair and informed decisions based on candidates' performance.

Efficient Screening Tool : Save time and resources by utilizing Alooba's Abstract Reasoning assessments as a screening tool. Identify candidates who demonstrate strong Abstract Reasoning abilities early in the hiring process, enabling you to focus on top talent and streamline your recruitment efforts effectively.

Comprehensive Reporting and Insights : Alooba provides detailed reports and insights on candidates' performance in Abstract Reasoning assessments. Gain a deeper understanding of candidates' strengths and areas for improvement, empowering you to make data-driven hiring decisions and identify the most suitable candidates for your organization.

By leveraging Alooba's assessment platform, you can confidently assess and measure candidates' Abstract Reasoning skill level. Our tailored assessments, customizable options, objective evaluation process, and comprehensive reporting capabilities ensure that you identify top talent proficient in Abstract Reasoning. Take advantage of Alooba to streamline your recruitment process and access the best candidates for your organization's success.

Subtopics Within Abstract Reasoning

Abstract Reasoning encompasses various subtopics that allow individuals to demonstrate their cognitive abilities in different ways. Here are some key areas that fall under the umbrella of Abstract Reasoning:

Pattern Recognition : Abstract Reasoning involves the ability to recognize patterns within abstract sequences or shapes. Candidates are assessed on their capability to identify recurring patterns, anticipate the next element, and comprehend the underlying logic behind the sequence.

Analogies : Analogical reasoning is another aspect of Abstract Reasoning. It requires individuals to make connections between different sets of elements based on shared characteristics or relationships. Candidates are evaluated on their proficiency in identifying and extending analogies, highlighting their ability to uncover similarities and apply them to new situations.

Logical Deduction : Abstract Reasoning also incorporates logical deduction skills. Candidates are presented with a series of statements or propositions, and they must draw conclusions based on the given information. This subtopic assesses individuals' capacity to think logically, follow premises, and arrive at valid inferences.
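As a toy illustration of the logical-deduction subtopic (not an actual test item), the process of "follow premises to valid inferences" can be sketched as forward chaining over implication rules: repeatedly apply modus ponens until no new conclusions emerge. The proposition names below are invented for the example.

```python
def deduce(facts, rules):
    """Forward-chain over implication rules until no new facts emerge.

    facts: set of known propositions, e.g. {"socrates_is_human"}
    rules: list of (premise, conclusion) pairs, read as premise -> conclusion
    Returns the closure: everything derivable from the starting facts.
    """
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premise, conclusion in rules:
            if premise in facts and conclusion not in facts:
                facts.add(conclusion)   # modus ponens: premise holds, so conclusion holds
                changed = True
    return facts

rules = [("socrates_is_human", "socrates_is_mortal"),
         ("socrates_is_mortal", "socrates_will_die")]
print(deduce({"socrates_is_human"}, rules))
```

A candidate doing a deduction item performs essentially this closure mentally: chaining given statements until the asked-about conclusion is either reached or shown unreachable.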

Spatial Visualization : Spatial visualization involves mentally visualizing and manipulating abstract shapes and objects. Candidates are tested on their ability to rotate, transform, or mentally manipulate images to understand spatial relationships. This skill is especially important in fields such as architecture, design, and engineering.

Inductive Reasoning : Inductive reasoning is the process of drawing general conclusions based on specific observations or patterns. In Abstract Reasoning assessments, candidates are presented with a set of elements or patterns, and they must identify the underlying rule or principle that governs the set. This subtopic evaluates individuals' ability to make logical generalizations and extrapolate information.

Metaphorical Reasoning : Metaphorical reasoning involves recognizing and understanding metaphorical or symbolic representations. Candidates may be presented with abstract symbols or shapes and asked to interpret the underlying meaning or symbolic representation. This subtopic assesses individuals' ability to think beyond literal interpretations and make connections between abstract concepts.

By exploring these subtopics within Abstract Reasoning, candidates can showcase their proficiency in various cognitive abilities, including pattern recognition, analogical reasoning, logical deduction, spatial visualization, inductive reasoning, and metaphorical reasoning. Assessing candidates' performance in these specific areas helps organizations gain insights into their Abstract Reasoning skills and evaluate their suitability for roles that require strong cognitive abilities.

Practical Applications of Abstract Reasoning

Abstract Reasoning is a cognitive ability with practical applications across various domains. Here are some ways in which Abstract Reasoning is used:

Problem-Solving : Abstract Reasoning is crucial for problem-solving. It enables individuals to analyze complex information, identify patterns, and recognize underlying relationships. By applying Abstract Reasoning skills, individuals can approach problems creatively and devise innovative solutions.

Decision-Making : Abstract Reasoning plays a vital role in decision-making processes. It allows individuals to evaluate multiple options, consider various factors, and make logical deductions to arrive at informed decisions. Abstract Reasoning helps minimize biases and facilitates rational decision-making.

Research and Analysis : In research and analysis fields, Abstract Reasoning is highly valuable. It aids in identifying trends, uncovering insights, and drawing conclusions from large datasets or complex information. Abstract Reasoning skills enable researchers and analysts to see beyond surface-level information and make connections that lead to meaningful discoveries.

Design and Creativity : Abstract Reasoning is instrumental in design and creative endeavors. It helps individuals recognize patterns, visualize concepts, and develop innovative ideas. Abstract Reasoning allows designers to think critically, explore unconventional approaches, and create aesthetically pleasing and functional solutions.

Scientific and Mathematical Reasoning : Abstract Reasoning is foundational in scientific and mathematical disciplines. It enables individuals to understand complex theories, formulate hypotheses, and conduct experiments. Abstract Reasoning skills are essential for making connections between different scientific phenomena and analyzing mathematical concepts.

Technology and Engineering : In the technology and engineering fields, Abstract Reasoning is vital for problem-solving and innovation. It supports individuals in understanding intricate systems, identifying bottlenecks, and devising efficient solutions. Abstract Reasoning helps engineers and technologists think analytically and develop novel technologies and systems.

Critical Thinking and Analysis : Abstract Reasoning is closely linked to critical thinking and analysis. It helps individuals evaluate information objectively, identify logical inconsistencies, and dissect complex problems into manageable components. Abstract Reasoning enables individuals to think critically, ask probing questions, and arrive at well-reasoned conclusions.

Understanding the practical applications of Abstract Reasoning provides valuable insights into its significance in professional settings. Through problem-solving, decision-making, research, design, scientific reasoning, technology, and critical thinking, Abstract Reasoning contributes to improved outcomes and innovative solutions across various industries and disciplines.

Roles Requiring Strong Abstract Reasoning Skills

Several roles rely heavily on individuals with strong Abstract Reasoning skills. These roles require the ability to analyze complex patterns, think critically, and solve problems creatively. Here are some examples of roles on Alooba that benefit from good Abstract Reasoning skills:

Data Analyst : Data Analysts need to work with large datasets, identify trends, and uncover insights. Strong Abstract Reasoning skills help them recognize patterns within data, make connections, and draw meaningful conclusions.

Data Scientist : Data Scientists leverage Abstract Reasoning skills to discover patterns in data, build predictive models, and generate actionable insights. They apply complex algorithms and statistical techniques to solve problems and make data-driven decisions.

Insights Analyst : Insights Analysts analyze consumer behavior, market trends, and business metrics. Abstract Reasoning skills enable them to identify patterns, draw correlations, and provide valuable insights to support strategic decision-making.

Marketing Analyst : Marketing Analysts use Abstract Reasoning skills to analyze market trends, consumer preferences, and campaign performance. They connect marketing data points, find meaningful insights, and provide actionable recommendations for marketing strategies.

Product Analyst : Product Analysts rely on Abstract Reasoning skills to analyze user behavior, identify UX patterns, and evaluate product performance. They generate insights that drive product enhancements and maximize user satisfaction.

Analytics Engineer : Analytics Engineers design and develop data pipelines, build data models, and ensure data quality. They require Abstract Reasoning skills to understand complex relationships, optimize data flows, and create efficient analytical solutions.

Data Architect : Data Architects design and manage data structures and storage systems. Abstract Reasoning skills are crucial in modeling complex data relationships, ensuring data integrity, and creating efficient databases.

Data Pipeline Engineer : Data Pipeline Engineers build and manage data pipelines, transforming and transferring data across systems. Abstract Reasoning skills help them design efficient data workflows, identify bottlenecks, and optimize pipeline performance.

Deep Learning Engineer : Deep Learning Engineers develop and implement complex machine learning models. Abstract Reasoning skills enable them to understand intricate model architectures, tune hyperparameters, and optimize model performance.

Demand Analyst : Demand Analysts analyze market demand patterns, forecast sales, and optimize inventory. They leverage Abstract Reasoning skills to identify demand trends, interpret market signals, and make accurate forecasts.

DevOps Engineer : DevOps Engineers use Abstract Reasoning skills to automate processes, design scalable infrastructure, and optimize system performance. They analyze system architectures, identify dependencies, and ensure smooth deployment and operation.

These roles highlight the importance of strong Abstract Reasoning skills in various fields such as data analysis, marketing, product management, and engineering. By honing these skills, professionals can excel in their respective domains and make valuable contributions to their organizations.

Associated Roles

Analytics Engineer

Analytics Engineers are responsible for preparing data for analytical or operational uses. These professionals bridge the gap between data engineering and data analysis, ensuring data is not only available but also accessible, reliable, and well-organized. They typically work with data warehousing tools, ETL (Extract, Transform, Load) processes, and data modeling, often using SQL, Python, and various data visualization tools. Their role is crucial in enabling data-driven decision making across all functions of an organization.

Data Analyst

Data Analysts draw meaningful insights from complex datasets with the goal of making better decisions. Data Analysts work wherever an organization has data - these days that could be in any function, such as product, sales, marketing, HR, operations, and more.

Data Architect

Data Architects are responsible for designing, creating, deploying, and managing an organization's data architecture. They define how data is stored, consumed, integrated, and managed by different data entities and IT systems, as well as any applications using or processing that data. Data Architects ensure data solutions are built for performance and design analytics applications for various platforms. Their role is pivotal in aligning data management and digital transformation initiatives with business objectives.

Data Governance Analyst

Data Governance Analysts play a crucial role in managing and protecting an organization's data assets. They establish and enforce policies and standards that govern data usage, quality, and security. These analysts collaborate with various departments to ensure data compliance and integrity, and they work with data management tools to maintain the organization's data framework. Their goal is to optimize data practices for accuracy, security, and efficiency.

Data Pipeline Engineer

Data Pipeline Engineers are responsible for developing and maintaining the systems that allow for the smooth and efficient movement of data within an organization. They work with large and complex data sets, building scalable and reliable pipelines that facilitate data collection, storage, processing, and analysis. Proficient in a range of programming languages and tools, they collaborate with data scientists and analysts to ensure that data is accessible and usable for business insights. Key technologies often include cloud platforms, big data processing frameworks, and ETL (Extract, Transform, Load) tools.

Data Scientist

Data Scientists are experts in statistical analysis and use their skills to interpret and extract meaning from data. They operate across various domains, including finance, healthcare, and technology, developing models to predict future trends, identify patterns, and provide actionable insights. Data Scientists typically have proficiency in programming languages like Python or R and are skilled in using machine learning techniques, statistical modeling, and data visualization tools such as Tableau or PowerBI.

Deep Learning Engineer

Deep Learning Engineers’ role centers on the development and optimization of AI models, leveraging deep learning techniques. They are involved in designing and implementing algorithms, deploying models on various platforms, and contributing to cutting-edge research. This role requires a blend of technical expertise in Python, PyTorch or TensorFlow, and a deep understanding of neural network architectures.

Demand Analyst

Demand Analysts specialize in predicting and analyzing market demand, using statistical and data analysis tools. They play a crucial role in supply chain management, aligning product availability with customer needs. This involves collaborating with sales, marketing, and production teams, and utilizing CRM and BI tools to inform strategic decisions.

DevOps Engineer

DevOps Engineers play a crucial role in bridging the gap between software development and IT operations, ensuring fast and reliable software delivery. They implement automation tools, manage CI/CD pipelines, and oversee infrastructure deployment. This role requires proficiency in cloud platforms, scripting languages, and system administration, aiming to improve collaboration, increase deployment frequency, and ensure system reliability.

Insights Analyst

Insights Analysts play a pivotal role in transforming complex data sets into actionable insights, driving business growth and efficiency. They specialize in analyzing customer behavior, market trends, and operational data, utilizing advanced tools such as SQL, Python, and BI platforms like Tableau and Power BI. Their expertise aids in decision-making across multiple channels, ensuring data-driven strategies align with business objectives.

Marketing Analyst

Marketing Analysts specialize in interpreting data to enhance marketing efforts. They analyze market trends, consumer behavior, and campaign performance to inform marketing strategies. Proficient in data analysis tools and techniques, they bridge the gap between data and marketing decision-making. Their role is crucial in tailoring marketing efforts to target audiences effectively and efficiently.

Product Analyst

Product Analysts utilize data to optimize product strategies and enhance user experiences. They work closely with product teams, leveraging skills in SQL, data visualization (e.g., Tableau), and data analysis to drive product development. Their role includes translating business requirements into technical specifications, conducting A/B testing, and presenting data-driven insights to inform product decisions. Product Analysts are key in understanding customer needs and driving product innovation.

Other names for Abstract Reasoning include Inductive Reasoning, Logical Reasoning, and Diagrammatic Reasoning.

Ready to Assess Abstract Reasoning Skills and More?

Book a Discovery Call with Alooba Today

Discover how Alooba's advanced assessment platform can help you evaluate candidates' Abstract Reasoning skills alongside other essential capabilities. Streamline your hiring process and identify top talent effortlessly.

Over 50,000 Candidates Can't Be Wrong


Our Customers Say

I was at WooliesX (Woolworths) and we used Alooba and it was a highly positive experience. We had a large number of candidates. At WooliesX, previously we were quite dependent on the designed test from the team leads. That was quite a manual process. We realised it would take too much time from us. The time saving is great. Even spending 15 minutes per candidate with a manual test would be huge - hours per week, but with Alooba we just see the numbers immediately.

Shen Liu, Logickube (Principal at Logickube)

We get a high flow of applicants, which leads to potentially longer lead times, causing delays in the pipelines which can lead to missing out on good candidates. Alooba supports both speed and quality. The speed to return to candidates gives us a competitive advantage. Alooba provides a higher level of confidence in the people coming through the pipeline with less time spent interviewing unqualified candidates.

Scott Crowe, Canva (Lead Recruiter - Data)

How can you accurately assess somebody's technical skills, like the same way across the board, right? We had devised a Tableau-based assessment. So it wasn't like a pass/fail. It was kind of like, hey, what do they send us? Did they understand the data, or were the values that they're showing accurate? Where we'd say, hey, here's the credentials to access the data set. And it just wasn't really a scalable way to assess technical skills - just administering it, all of it was manual, but the whole process sucked!

Cole Brickley, Avicado (Director Data Science & Business Intelligence)

The diversity of our pool has definitely improved so we just have many more candidates from just different backgrounds which I am a huge believer in. It makes the team much better, it makes our output much better and gives us more voices in terms of building the best product and service that we can.

Piers Stobbs, Cazoo (Chief Data Officer)

I wouldn't dream of hiring somebody in a technical role without doing that technical assessment because the number of times where I've had candidates either on paper on the CV, say, I'm a SQL expert or in an interview, saying, I'm brilliant at Excel, I'm brilliant at this. And you actually put them in front of a computer, say, do this task. And some people really struggle. So you have to have that technical assessment.

Mike Yates, The British Psychological Society (Head of Data & Analytics)

We were very quickly quite surprised with the quality of candidates we would get from Alooba. We ended up hiring eight different analysts via Alooba in about a year's time, which is quite extraordinary for us because we actually have almost never used a recruitment agency for any role. It has been our best outsourcing solution by far.

Oz Har Adir, Vio.com (Founder & CEO)

For data engineering & analytics these take-home assignments we were doing ourselves are a bit time consuming so we wanted to automate that and also reduce the time candidates were spending on the assessment.

Sharin Fritz, Personio (Tech Talent Acquisition)


Abstract Thinking

Abstract thinking is a fundamental cognitive process that allows us to explore and understand concepts beyond the realm of concrete reality. It involves the ability to think conceptually, creatively, and symbolically, enabling us to grasp complex ideas, solve problems, and engage in higher-order thinking.

Abstract thinking can be defined as the mental ability to conceptualize and understand concepts that are not directly tied to physical objects or concrete events. Unlike concrete thinking that focuses on specific, tangible things, abstract thinking allows us to derive meaning, interpret symbols, make inferences, recognize patterns, and engage in metaphorical and symbolic reasoning. It is a process that goes beyond surface-level understanding and helps us navigate the complexities of the world. Everyday examples of abstract thinking include:

  • Interpreting a poem’s underlying meaning rather than focusing solely on its literal words.
  • Understanding the concept of justice and evaluating its application in various scenarios.
  • Recognizing and appreciating symbolism in art, literature, and music.
  • Arriving at a logical conclusion by examining multiple perspectives and possibilities.
  • Using analogies to explain complex ideas or relationships.
  • Developing and testing hypotheses in scientific experiments.

The Importance of Abstract Thinking

Abstract thinking plays a crucial role in various aspects of our lives. It is not only an essential cognitive skill but also a tool for problem-solving, decision-making, and creativity. Here are a few key areas where abstract thinking is particularly valuable:

  • Education: Abstract thinking helps students engage in critical thinking, analyze information, and delve deeper into subjects beyond surface-level knowledge. It promotes a deeper understanding of complex ideas and encourages independent thinking.
  • Problem Solving: When faced with challenges, abstract thinking allows us to generate innovative solutions by exploring unconventional possibilities and finding connections between seemingly unrelated concepts. It helps us think “outside the box” and discover new perspectives.
  • Creativity: Abstract thinking fuels creativity by allowing us to envision and create something new. Artists, musicians, writers, and inventors rely heavily on abstract thinking to generate original ideas, visualize concepts, and communicate abstract emotions or experiences.
  • Communication: Abstract thinking enhances effective communication by enabling us to convey complex ideas using metaphors, analogies, and symbolic language. It helps us express ourselves more vividly and engage listeners or readers on a deeper, emotional level.
  • Decision-Making: Abstract thinking aids in decision-making, as it helps us consider the potential consequences, evaluate different options, and anticipate long-term effects. By thinking abstractly, we can make informed choices and weigh the pros and cons of each alternative.

Tips for Enhancing Abstract Thinking

While abstract thinking comes naturally to some individuals, it can also be developed and strengthened through practice. Here are a few tips to enhance your abstract thinking abilities:

  • Embrace Curiosity: Cultivate a curious mindset and ask questions that encourage deeper thinking.
  • Engage in Creative Activities: Explore art, music, writing, or any activity that encourages abstract thinking and self-expression.
  • Read Widely: Engage with diverse literature and expose yourself to different perspectives, ideologies, and worldviews.
  • Practice Symbolic Reasoning: Analyze symbols, metaphors, and allegories in various forms of media to develop your ability to interpret abstract representations.
  • Investigate Opposing Views: Challenge your own beliefs by seeking out and critically evaluating opposing viewpoints.
  • Play Brain-stimulating Games: Engage in puzzles, riddles, and strategy games that require abstract thinking and problem-solving.

Abstract thinking is a remarkable cognitive ability that allows us to explore the world beyond its concrete boundaries. By unlocking the power of imagination, symbolism, and conceptualization, abstract thinking enriches our lives and enables us to navigate the complexities of our existence. Enhancing our abstract thinking skills not only empowers us intellectually but also enhances our creativity, problem-solving abilities, and decision-making skills.

Engineering Problem-Solving

  • First Online: 21 September 2022


  • Michelle Blum


You are becoming an engineer to become a problem solver. That is why employers will hire you. Since problem solving is an essential part of the engineering profession, it is necessary to learn approaches that will lead to an acceptable resolution. In real life, the problems engineers solve vary from simple single-solution problems to complex, open-ended ones. Whether simple or complex, problem solving involves knowledge, experience, and creativity. In college, you will learn prescribed processes you can follow to improve your problem-solving abilities, and you will be required to solve a large number of practice and homework problems to gain experience in problem solving. This chapter introduces problem analysis, organization, and presentation in the context of the problems you will solve throughout your undergraduate education.



Author information

Authors and Affiliations

Syracuse University, Syracuse, NY, USA

Michelle Blum


End of Chapter Problems

1.1 IBL Questions

IBL1: Using the standard problem-solving technique, answer the following questions:

If you run in a straight line at a velocity of 10 mph in a direction of 35 degrees north of east, draw the vector representation of your path (hint: use a compass legend to help create your coordinate system).

If you run in a straight line at a velocity of 10 mph in a direction of 35 degrees north of east, explain how to calculate the velocity you ran in the north direction.

If you run in a straight line at a velocity of 10 mph in a direction of 35 degrees north of east, explain how to calculate the velocity you ran in the east direction.

If you run in a straight line at a velocity of 10 mph in a direction of 35 degrees north of east, explain how to calculate how far you ran in the north direction.

If you run in a straight line at a velocity of 10 mph in a direction of 35 degrees north of east, explain how to calculate how far you ran in the east direction.

If you run in a straight line at a velocity of 10 mph in a direction of 35 degrees north of east, how far north have you traveled in 5 min?

If you run in a straight line at a velocity of 10 mph in a direction of 35 degrees north of east, how far east have you traveled in 5 min?

What type of problem did you solve?
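The component questions in IBL1 all reduce to resolving one velocity vector into north and east parts. A minimal Python sketch of that decomposition (this check is not part of the chapter; the variable names and rounded values are my own):

```python
import math

speed_mph = 10.0              # running speed
angle = math.radians(35)      # 35 degrees north of east

# Resolve the velocity into east (x) and north (y) components.
v_east = speed_mph * math.cos(angle)    # ~8.19 mph
v_north = speed_mph * math.sin(angle)   # ~5.74 mph

# Distance covered in 5 minutes (5/60 of an hour).
t_hours = 5 / 60
d_east = v_east * t_hours    # ~0.68 mi east
d_north = v_north * t_hours  # ~0.48 mi north
```

Note that `math.cos` and `math.sin` take radians, so the 35-degree heading must be converted first.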

IBL2: For the following scenarios, explain what type of problem it is that needs to be solved.

Scientists hypothesize that PFAS chemicals in lawn care products are leading to an increase in toxic algae blooms in lakes during summer weather.

An engineer notices that a manufacturing machine motor hums every time the fluorescent floor lights are turned on.

The U.N. warns that food production must be increased by 60% by 2050 to keep up with population growth demand.

Engineers are working to identify and create viable alternative energy sources to combat climate change.

1.2 Practice Problems

Make sure all problems are written up using appropriate problem-solving technique and presentation.

The principle of conservation of energy states that the sum of an object's kinetic and potential energy is the same in its initial and final states. If an engineering student is riding in a 200 kg roller coaster car that starts from rest 10 m above the ground, what is the velocity of the car when it drops to 2.5 m above the ground?
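For the roller coaster problem, equating energies gives m·g·h₁ = m·g·h₂ + ½·m·v², so the mass cancels and v = √(2·g·Δh). A quick numeric check (my arithmetic, not the text's; g is taken as 9.81 m/s²):

```python
import math

g = 9.81          # m/s^2, gravitational acceleration (assumed value)
h_initial = 10.0  # m, car starts from rest at this height
h_final = 2.5     # m

# m*g*h1 = m*g*h2 + 0.5*m*v^2  ->  mass cancels out
v = math.sqrt(2 * g * (h_initial - h_final))   # ~12.1 m/s
```

The 200 kg mass never enters the calculation, which is itself a useful lesson in identifying relevant information.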

Archimedes’ principle states that the total mass of a floating object equals the mass of the fluid displaced by the object. A 45 cm cylindrical buoy is floating vertically in the water. If the water density is 1.00 g/cm³ and the buoy plastic has a density of 0.92 g/cm³, determine the length of the buoy that is not submerged underwater.
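For a uniform floating cylinder, Archimedes' principle implies the submerged fraction equals the density ratio ρ_buoy/ρ_water, so the dry length is L·(1 − ρ_buoy/ρ_water). A sketch of that arithmetic (my own check, not from the text):

```python
L = 45.0          # cm, total buoy length
rho_water = 1.00  # g/cm^3
rho_buoy = 0.92   # g/cm^3

# Floating equilibrium: rho_buoy * A * L = rho_water * A * L_submerged
# (the cross-sectional area A cancels for a uniform cylinder)
L_submerged = L * rho_buoy / rho_water   # 41.4 cm under water
L_dry = L - L_submerged                  # 3.6 cm above water
```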

A student throws their textbook off a bridge that is 30 ft high. How long would it take before the book hits the ground?
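The dropped-book problem is free fall from rest: h = ½·g·t², so t = √(2h/g). Working entirely in US customary units avoids a conversion; the value g ≈ 32.2 ft/s² is my assumption, since the problem statement does not fix it:

```python
import math

h = 30.0   # ft, bridge height
g = 32.2   # ft/s^2, gravitational acceleration (assumed)

# h = 0.5 * g * t^2  ->  t = sqrt(2 * h / g)
t = math.sqrt(2 * h / g)   # ~1.37 s
```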


Copyright information

© 2022 Springer Nature Switzerland AG

About this chapter

Blum, M. (2022). Engineering Problem-Solving. In: An Inquiry-Based Introduction to Engineering. Springer, Cham. https://doi.org/10.1007/978-3-030-91471-4_6


DOI: https://doi.org/10.1007/978-3-030-91471-4_6

Published: 21 September 2022

Publisher Name: Springer, Cham

Print ISBN: 978-3-030-91470-7

Online ISBN: 978-3-030-91471-4

eBook Packages: Engineering (R0)



A Detailed Characterization of the Expert Problem-Solving Process in Science and Engineering: Guidance for Teaching and Assessment

  • Argenta M. Price
  • Candice J. Kim
  • Eric W. Burkholder
  • Amy V. Fritz
  • Carl E. Wieman

*Address correspondence to: Argenta M. Price ([email protected]).

Department of Physics, Stanford University, Stanford, CA 94305


Graduate School of Education, Stanford University, Stanford, CA 94305

School of Medicine, Stanford University, Stanford, CA 94305

Department of Electrical Engineering, Stanford University, Stanford, CA 94305

A primary goal of science and engineering (S&E) education is to produce good problem solvers, but how to best teach and measure the quality of problem solving remains unclear. The process is complex, multifaceted, and not fully characterized. Here, we present a detailed characterization of the S&E problem-solving process as a set of specific interlinked decisions. This framework of decisions is empirically grounded and describes the entire process. To develop this, we interviewed 52 successful scientists and engineers (“experts”) spanning different disciplines, including biology and medicine. They described how they solved a typical but important problem in their work, and we analyzed the interviews in terms of decisions made. Surprisingly, we found that across all experts and fields, the solution process was framed around making a set of just 29 specific decisions. We also found that the process of making those discipline-general decisions (selecting between alternative actions) relied heavily on domain-specific predictive models that embodied the relevant disciplinary knowledge. This set of decisions provides a guide for the detailed measurement and teaching of S&E problem solving. This decision framework also provides a more specific, complete, and empirically based description of the “practices” of science.

INTRODUCTION

Many faculty members with new graduate students and many managers with employees who are recent college graduates have had similar experiences. Their advisees/employees have just completed a program of rigorous course work, often with distinction, but they seem unable to solve the real-world problems they encounter. The supervisor struggles to figure out exactly what the problem is and how they can guide the person in overcoming it. This paper provides a way to answer those questions in the context of science and engineering (S&E). By characterizing the problem-solving process of experts, this paper investigates the “mastery” performance level and specifies an overarching learning goal for S&E students, which can be taught and measured to improve teaching.

The importance of problem solving as an educational outcome has long been recognized, but too often postsecondary S&E graduates have serious difficulties when confronted with real-world problems ( Quacquarelli Symonds, 2018 ). This reflects two long-standing educational problems with regard to problem solving: how to properly measure it, and how to effectively teach it. We theorize that the root of these difficulties is that good “problem solving” is a complex multifaceted process, and the details of that process have not been sufficiently characterized. Better characterization of the problem-solving process is necessary to allow problem solving, and more particularly, the complex set of skills and knowledge it entails, to be measured and taught more effectively. We sought to create an empirically grounded conceptual framework that would characterize the detailed structure of the full problem-solving process used by skilled practitioners when solving problems as part of their work. We also wanted a framework that would allow use and comparison across S&E disciplines. To create such a framework, we examined the operational decisions (choices among alternatives that result in subsequent actions) that these practitioners make when solving problems in their discipline.

Various aspects of problem solving have been studied across multiple domains, using a variety of methods (e.g., Newell and Simon, 1972 ; Dunbar, 2000 ; National Research Council [NRC], 2012b ; Lintern et al. , 2018 ). These ranged from expert self-reflections (e.g., Polya, 1945 ), to studies on knowledge lean tasks to discover general problem-solving heuristics (e.g., Egan and Greeno, 1974 ), to comparisons of expert and novice performances on simplified problems across a variety of disciplines (e.g., Chase and Simon, 1973 ; Chi et al. , 1981 ; Larkin and Reif, 1979 ; Ericsson et al. , 2006 , 2018 ). These studies revealed important novice–expert differences—notably, that experts are better at identifying important features and have knowledge structures that allow them to reduce demands on working memory. Studies that specifically gave the experts unfamiliar problems in their disciplines also found that, relative to novices, they had more deliberate and reflective strategies, including more extensive planning and managing of their own behavior, and they could use their knowledge base to better define the problem ( Schoenfeld, 1985 ; Wineburg, 1998 ; Singh, 2002 ). While these studies focused on discrete cognitive steps of the individual, an alternative framing of problem solving has been in terms of “ecological psychology” of “situativity,” looking at how the problem solver views and interacts with the environment in terms of affordances and constraints ( Greeno, 1994 ). “Naturalistic decision making” is a related framework that specifically examines how experts make decisions in complex, real-world, settings, with an emphasis on the importance of assessing the situation surrounding the problem at hand ( Klein, 2008 ; Mosier et al. , 2018 ).

While this work on expertise has provided important insights into the problem-solving process, its focus has been limited. Most has focused on looking for cognitive differences between experts and novices using limited and targeted tasks, such as remembering the pieces on a chessboard ( Chase and Simon, 1973 ) or identifying the important concepts represented in an introductory physics textbook problem ( Chi et al. , 1981 ). It did not attempt to explore the full process of solving, particularly for solving the type of complex problem that a scientist or engineer encounters as a member of the workforce (“authentic problems”).

There have also been many theoretical proposals as to expert problem-solving practices, but with little empirical evidence as to their completeness or accuracy (e.g., Polya, 1945 ; Heller and Reif, 1984 ; Organisation for Economic Cooperation and Development [OECD], 2019 ). The work of Dunbar (2000) is a notable exception to the lack of empirical work, as his group did examine how biologists solved problems in their work by analyzing lab meetings held by eight molecular biology research groups. His groundbreaking work focused on creativity and discovery in the research process, and he identified the importance of analogical reasoning and distributed reasoning by scientists in answering research questions and gaining new insights. Kozma et al. (2000) studied professional chemists solving problems, but their work focused only on the use of specialized representations.

The “cognitive systems engineering” approach ( Lintern et al. , 2018 ) takes a more empirically based approach looking at experts solving problems in their work, and as such tends to span aspects of both the purely cognitive and the ecological psychological theories. It uses both observations of experts in authentic work settings and retrospective interviews about how experts carried out particular work tasks. This theoretical framing and the experimental methods are similar to what we use, particularly in the “naturalistic decision making” area of research ( Mosier et al. , 2018 ). That work looks at how critical decisions are made in solving specific problems in their real-world setting. The decision process is studied primarily through retrospective interviews about challenging cases faced by experts. As described below, our methods are adapted from that work ( Crandall et al. , 2006 ), though there are some notable differences in focus and field. A particular difference is that we focused on identifying what are decisions to be made, which are more straight-forward to identify from retrospective interviews than how those decisions are made. We all have the same ultimate goal, however, to improve the training/teaching of the respective expertise.

Problem solving is central to the processes of science, engineering, and medicine, so research and educational standards about scientific thinking and the process and practices of science are also relevant to this discussion. Work by Osborne and colleagues describes six styles of scientific reasoning that can be used to explain how scientists and students approach different problems ( Kind and Osborne, 2016 ). There are also numerous educational standards and frameworks that, based on theory, lay out the skills or practices that science and engineering students are expected to master (e.g., American Association for the Advancement of Science [AAAS], 2011 ; Next Generation Science Standards Lead States, 2013 ; OECD, 2019 ; ABET, 2020 ). More specifically related to the training of problem solving, Priemer et al. (2020) synthesizes literature on problem solving and scientific reasoning to create a “STEM [science, technology, engineering, and mathematics] and computer science framework for problem solving” that lays out steps that could be involved in a students’ problem-solving efforts across STEM fields. These frameworks provide a rich groundwork, but they have several limitations: 1) They are based on theoretical ideas of the practice of science, not empirical evidence, so while each framework contains overlapping elements of the problem-solving process, it is unclear whether they capture the complete process. 2) They are focused on school science, rather than the actual problem solving that practitioners carry out and that students will need to carry out in future STEM careers. 3) They are typically underspecified, so that the steps or practices apply generally, but it is difficult to translate them into measurable learning goals for students to practice. Working to address that, Clemmons et al. 
(2020) recently sought to operationalize the core competencies from the Vision and Change report ( AAAS, 2011 ), establishing a set of skills that biology students should be able to master.

Our work seeks to augment this prior work by building a conceptual framework that is empirically based, grounded in how scientists and engineers solve problems in practice instead of in school. We base our framework on the decisions that need to be made during problem solving, which makes each item clearly defined for practice and assessment. In our analysis of expert problem solving, we empirically identified the entire problem-solving process. We found this includes deciding when and how to use the steps and skills defined in the work described previously but also includes additional elements. There are also questions in the literature about how generalizable across fields a particular set of practices may be. Here, we present the first empirical examination of the entire problem-solving process, and we compare that process across many different S&E disciplines.

A variety of instructional methods have been used to try and teach science and engineering problem solving, but there has been little evidence of their efficacy at improving problem solving (for a review, see NRC, 2012b ). Research explicitly on teaching problem solving has primarily focused on textbook-type exercises and utilized step-by-step strategies or heuristics. These studies have shown limited success, often getting students to follow specific procedural steps but with little gain in actually solving problems and showing some potential drawbacks ( Heller and Reif, 1984 ; Heller et al. , 1992 ; Huffman, 1997 ; Heckler, 2010 ; Kuo et al. , 2017 ). As discussed later, the framework presented here offers guidance for different and potentially more effective approaches to teaching problem solving.

These challenges can be illustrated by considering three different problems taken from courses in mechanical engineering, physics, and biology, respectively ( Figure 1 ). All of these problems are challenging, requiring considerable knowledge and effort by the student to solve correctly. Problems such as these are routinely used to assess students’ problem-solving skills, and students are expected to learn such skills by practicing such problems. However, it is obvious to any expert in the respective fields that, while these problems might be complicated and difficult to answer, they are vastly different from authentic problems in that field. They all have well-defined answers that can be reached by straightforward solution paths. More specifically, they do not require using judgment to make any decisions based on limited information (e.g., information insufficient to specify a correct decision with certainty). The relevant concepts, information, and assumptions are all stated or obvious. The failure of problems like these to capture the complexity of authentic problem solving underlies the failure of efforts to measure and teach problem solving. Recognizing this failure motivated our efforts to more completely characterize the problem-solving process of practicing scientists, engineers, and doctors.

FIGURE 1. Example problems from courses or textbooks in mechanical engineering, physics and biology. Problems from: Mechanical engineering: Wayne State mechanical engineering sample exam problems (Wayne State, n.d.), Physics: A standard physics problem in nearly every advanced quantum mechanics course, Biology: Molecular Biology of the Cell 6th edition, Chapter 7 end of chapter problems ( Alberts et al ., 2014 ).

We are building on the previous work studying expert–novice differences and problem solving but taking a different direction. We sought to create an empirically grounded framework that would characterize the detailed structure of the full problem-solving process by focusing on the operational decisions that skilled practitioners make when successfully solving authentic problems in their scientific, engineering, or medical work. We chose to identify the decisions that S&E practitioners made, because, unlike potentially nebulous skills or general problem-solving steps that might change with the discipline, decisions are sufficiently specified that they can be individually practiced by students and measured by instructors or departments. The authentic problems that we analyzed are typical problems practitioners encounter in “doing” the science or engineering entailed in their jobs. In the language of traditional problem-solving and expertise research, such authentic problems are “ill-structured” ( Simon, 1973 ) and require “adaptive expertise” ( Hatano and Inagaki, 1986 ) to solve. However, our authentic problems are considerably more complex and unstructured than what is normally considered in those literatures, because not only do they lack a clear solution path, but in many cases, it is not clear a priori that they have any solution at all. Determining that, and whether the problem needs to be redefined to be soluble, is part of the successful expert solution process. Another way in which our set of decisions goes beyond the characterization of what is involved in adaptive expertise is the prominent role of making judgments with limited information.

A common reaction of scientists and engineers to seeing the list of decisions we obtain as our primary result is, “Oh, yes, these are things I always do in solving problems. There is nothing new here.” It is comforting that these decisions all look familiar; that supports their validity. However, what is new is not that experts are making such decisions, but rather that there is a relatively small but complete set of decisions that has now been explicitly identified and that applies so generally.

We have used a much larger and broader sample of experts in this work than used in prior expert–novice studies, and we used a more stringent selection criterion. Previous empirical work has typically involved just a few experts, almost always in a single domain, and included graduate students as “experts” in some cases. Our semistructured interview sample was 31 experienced practitioners from 10 different disciplines of science, engineering, and medicine, with demonstrated competence and accomplishments well beyond those of most graduate students. Also, approximately 25 additional experts from across science, engineering, and medicine served as consultants during the planning and execution of this work.

Our research question was: What are the decisions experts make in solving authentic problems, and to what extent is this set of decisions to be made consistent both within and across disciplines?

Our approach was designed to identify the level of consistency and unique differences across disciplines. Our hypothesis was that there would be a manageable number (20–50) of decisions to be made, with a large amount of overlap of decisions made between experts within each discipline and a substantial but smaller overlap across disciplines. We believed that if we had found that every expert and/or discipline used a large and completely unique set of decisions, it would have been an interesting research result but of little further use. If our hypothesis turned out to be correct, we expected that the set of decisions obtained would have useful applications in guiding teaching and assessment, as they would show how experts in the respective disciplines applied their content knowledge to solve problems and hence provide a model for what to teach. We were not expecting to find the nearly complete degree of overlap in the decisions made across all the experts.

We first conducted 22 relatively unstructured interviews with a range of S&E experts, in which we asked about problem-solving expertise in their fields. From these interviews, we developed an initial list of decisions to be made in S&E problem solving. To refine and validate the list, we then carried out a set of 31 semistructured interviews in which S&E experts chose a specific problem from their work and described the solution process in detail. The semistructured interviews were coded for the decisions represented, either explicitly stated or implied by a choice of action. This provided a framework of decisions that characterize the problem-solving process across S&E disciplines. The research was approved by the Stanford Institutional Review Board (IRB no. 48785), and informed consent was obtained from all the participants.

This work involved interviewing many experts across different fields. We defined experts as practicing scientists, engineers, or physicians with considerable experience working as faculty at highly rated universities or having several years of experience working in moderately high-level technical positions at successful companies. We also included a few longtime postdocs and research staff in biosciences to capture more details of experimental decisions from which faculty members in those fields often were more removed. This definition of expert allows us to identify the practices of skilled professionals; we are not studying what makes only the most exceptional experts unique.

Experts were volunteers recruited through direct contact via the research team's personal and professional networks and referrals from experts in our networks. This recruitment method likely biased our sample toward people who experienced relatively similar training (most were trained in STEM disciplines at U.S. universities within the last 15–50 years). Within this limitation, we attempted to get a large range of experts by field and experience. This included people from 10 different fields (including molecular biology/biochemistry, ecology, and medicine), 11 U.S. universities, and nine different companies or government labs, and the sample was 33% female (though our engineering sample only included one female). The medical experts were volunteers from a select group of medical school faculty chosen to serve as clinical reasoning mentors for medical students at a prestigious university. We only contacted people who met our criteria for being an “expert,” and everyone who volunteered was included in the study. Most of the people who were contacted volunteered, and the only reason given for not volunteering was insufficient time. Other than their disciplinary expertise, there was little to distinguish these experts beyond the fact they were acquaintances with members of the team or acquaintances of acquaintances of team or project advisory board members. The precise number from each field was determined largely by availability of suitable experts.

We defined an “authentic problem” to be one that these experts solve in their actual jobs. Generally, this meant research projects for the science and engineering faculty, design problems for the industry engineers, and patient diagnoses for the medical doctors. Such problems are characterized by complexity, with many factors involved and no obvious solution process, and involve substantial time, effort, and resources. Such problems involve far more complexity and many more decisions, particularly decisions with limited information, than the typical problems used in previous problem-solving research or used with students in instructional settings.

Creating an Initial List of Problem-Solving Decisions

We first interviewed 22 experts ( Table 1 ), most of whom were faculty at a prestigious university, in which we asked them to discuss expertise and problem solving in their fields as it related to their own experiences. This usually resulted in their discussing examples of one or more problems they had solved. Based on the first seven interviews, plus reflections on personal experience from the research team and review of the literature on expert problem solving and teaching of scientific practices ( Ericsson et al. , 2006 ; NRC, 2012a ; Wieman, 2015 ), we created a generic list of decisions that were made in S&E problem solving. In the rest of the unstructured interviews (15), we also provided the experts with our list and asked them to comment on any additions or deletions they would suggest. Faculty who had close supervision of graduate students and industry experts who had extensively supervised inexperienced staff were particularly informative. Their observations of the way inexperienced people could fail made them sensitive to the different elements of expertise and where incorrect decisions could be made. Although we initially expected to find substantial differences across disciplines, from early in the process, we noted a high degree of overlap across the interviews in the decisions that were described.

URM (under-represented minority) participants included 3 African American and 2 Hispanic/Latinx experts. One medical faculty member was interviewed twice – in both informal and structured interviews – for a total of 53 interviews with 52 experts.

Refinement and Validation of the List of Decisions

After creating the preliminary list of decisions from the informal interviews, we conducted a separate set of more structured interviews to test and refine the list. Semistructured interviews were conducted with 31 experts from across science, engineering, and medical fields ( Table 1 ). For these interviews, we recruited experts from a range of universities and companies, though the range of institutions is still limited, given the sample size. Interviews were conducted in person or over video chat and were transcribed for analysis. In the semistructured interviews, experts were asked to choose a problem or two from their work that they could recall the details of solving and then describe the process, including all the steps and decisions they made. So that we could get a full picture of the successful problem-solving process, we decided to focus the interviews on problems that they had eventually solved successfully, though their processes inherently involved paths that needed to be revised and reconsidered. Transcripts from interviewees who agreed to have their interview transcript published are available in the supplemental data set.

Our interview protocol (see Supplemental Text) was inspired in part by the critical decision method of cognitive task analysis ( Crandall et al. , 2006 ; Lintern et al. , 2018 ), which was created for research in cognitive systems engineering and naturalistic decision making. There are some notable differences between our work and theirs, both in research goal and method. First, their goal is to improve training in specific fields by focusing on how critical decisions are made in that field during an unusual or important event; the analysis seeks to identify factors involved in making those critical decisions. We are focusing on the overall problem solving and how it compares across many different fields, which quickly led to attention on what decisions are to be made, rather than how a limited set of those decisions are made. We asked experts to describe a specific, but not necessarily unusual, problem in their work, and focused our analysis on identifying all decisions made, not reasons for making them or identifying which were most critical. The specific order of problem-solving steps was also less important to us, in part because it was clear that there was no consistent order that was followed. Second, we are looking at different types of work. Cognitive systems engineering work has primarily focused on performance in professions like firefighters, power plant operators, military technicians, and nurses. These tend to require time-sensitive critical skills that are taught with modest amounts of formal training. We are studying scientists, engineers, and doctors solving problems that require much longer and less time-critical solutions and for which the formal training occupies many years.

Given our different focus, we made several adaptations to eliminate some of the more time-consuming steps from the interview protocol, allowing us to limit the interview time to approximately 1 hour. Both protocols seek to elicit an accurate and complete reporting of the steps taken and decisions made in the process of solving a problem. Our general strategy was: 1) Have the expert explain the problem and talk step by step through the decisions involved in solving it, with relatively few interruptions from the interviewer except to keep the discussion focused on the specific problem and occasionally to ask for clarifications. 2) Ask follow-up questions to probe for more detail about particular steps and aspects of the problem-solving process. 3) Occasionally ask for general thoughts on how a novice's process might differ.

While some have questioned the reliability of information from retrospective interviews ( Nisbett and Wilson, 1977 ), we believe we avoid these concerns, because we are only identifying a decision to be made, which in this case, means identifying a well-defined action that was chosen from alternatives. This is less subjective and much more likely to be accurately recalled than is the rationale behind such a decision. See Ericsson and Simon (1980) . However, the decisions identified may still be somewhat limited—the process of deciding among possible actions might involve additional decisions in the moment, when the solution is still unknown, that we are unable to capture in the retrospective context. For the decisions we can identify, we are able to check their accuracy and completeness by comparing them with the actions taken in the conduct of the research/design. For example, consider this quote from a physician who had to re-evaluate a diagnosis, “And, in my very subjective sense, he seemed like he was being forthcoming and honest. Granted people can fool you, but he seemed like he was being forthcoming. So we had to reevaluate.” The physician then considered alternative diagnoses that could explain a test result that at first had indicated an incorrect diagnosis. While this quote does describe the (retrospective) reasoning behind a decision, we do not need to know whether that reasoning is accurately recalled. We can simply code this as “decision 18, how believable is info?” The physician followed up by considering alternative diagnoses, which in this context was coded as “26, how good is solution?” and “8, potential solutions?” This was followed by the description of the literature and additional tests conducted. These indicated actions taken that confirm the physician made a decision about the reliability of the information given by the patient.

Interview Coding

We coded the semistructured interviews in terms of decisions made, through iterative rounds of coding ( Chi, 1997 ), following a “directed content analysis approach,” which involves coding according to predefined theoretical categories and updating the codes as needed based on the data ( Hsieh and Shannon, 2005 ). Our predefined categories were the list of decisions we had developed during the informal interviews. This approach means that we limited the focus of our qualitative analysis—we were able to test and refine the list of decisions, but we did not seek to identify all possible categories of approach to selecting and solving problems. The goals of each iterative round of coding are described in the next three paragraphs. To code for decisions in general, we matched decisions from the list to statements in each interview, based on the following criteria: 1) there was an explicit statement of a decision or choice made or needing to be made; 2) there was the description of the outcome of a decision, such as listing important features of the problem (that had been decided on) or conclusions arrived at; or 3) there was a statement of actions taken that indicated a decision about the appropriate action had been made, usually from a set of alternatives. Two examples illustrate the types of comments we identified as decisions: A molecular biologist explicitly stated the decisions required to decompose a problem into subproblems (decision 11), “Which cell do we use? The gene. Which gene do we edit? Which part of that gene do we edit? How do we build the enzyme that is going to do the cutting? … And how do we read out that it worked?” An ecologist made a statement that was also coded as a decomposition decision, because it described the action taken: “So I analyze the bird data first on its own, rather than trying to smash all the taxonomic groups together because they seem really apples and oranges. 
And just did two kinds of analysis, one was just sort of across all of these cases, around the world.” A single statement could be coded as multiple decisions if they were occurring simultaneously in the story being recalled or were intimately interconnected in the context of that interview, as with the ecology quote, in which the last sentence leads into deciding what data analysis is needed. Inherent in nearly every one of these decisions was that there was insufficient information to know the answer with certainty, so judgment was required.

Our primary goal for the first iterative round of coding was to check whether our list was complete by checking for any decisions that were missing, as indicated by either an action taken or a stated decision that was not clearly connected to a decision on our initial list. In this round, we also clarified wording and combined decisions that we were consistently unable to differentiate during the coding. A sample of three interviews (from biology, medicine, and electrical engineering) were first coded independently by four coders (AP, EB, CK, and AF), then discussed. The decision list was modified to add decisions and update wording based on that discussion. Then the interviews were recoded with the new list and rediscussed, leading to more refinements to the list. Two additional interviews (from physics and chemical engineering) were then coded by three coders (AP, EB, and CK) and further similar refinements were made. Throughout the subsequent rounds of coding, we continued to check for missing decisions, but after the additions and adjustments made based on these five interviews, we did not identify any more missing decisions.

In our next round of coding, we focused on condensing overlapping decisions and refining wording to improve the clarity of descriptions as they applied across different disciplinary contexts and to ensure consistent interpretation by different coders. Two or three coders independently coded an additional 11 interviews, iteratively meeting to discuss codes identified in the interviews, refining wording and condensing the list to improve agreement and combine overlapping codes, and then using the updated list to code subsequent interviews. We condensed the list by combining decisions that represented the same cognitive process taking place at different times, that were discipline-specific variations on the same decision, or that were substeps involved in making a larger decision. We noticed that some decisions were frequently co-coded with others, particularly in some disciplines. But if they were identified as distinct a reasonable fraction of the time in any discipline, we listed them as separate. This provided us with a list, condensed from 42 to 29 discrete decisions (plus five additional non-decision themes that were so prevalent that they are important to describe), that gave good consistency between coders.

Finally, we used the resulting codes to tabulate which decisions occurred in each interview, simplifying our coding process to focus on deciding whether or not each decision had occurred, with an example if it did occur to back up the “yes” code, but no longer attempting to capture every time each decision was mentioned. Individual coders identified decisions mentioned in the remaining 15 interviews. Interviews that had been coded with the early versions of the list were also recoded to ensure consistency. Coders flagged any decisions they were unsure about occurring in a particular interview, and two to four coders (AP, EB, CK, and CW) met to discuss those debated codes, with most uncertainties being resolved by explanations from a team member who had more technical expertise in the field of the interview. Minor wording changes were made during this process to ensure that each description of a decision captured all instantiations of the decision across disciplines, but no significant changes to the list were needed or made.

Coding an interview in terms of decisions made and actions taken in the research often required a high level of expertise in the discipline in question. The coder had to be familiar with the conduct of research in the field in order to recognize which actions corresponded to a decision between alternatives, but our team was assembled with this requirement in mind. It included high-level expertise across five different fields of science, engineering, and medicine and substantial familiarity with several other fields.

Supplemental Table S1 shows the final tabulation of decisions identified in each interview. In the tabulation, most decisions were marked as either “yes” or “no” for each interview, though 65 out of 1054 total were marked as “implied,” for one of the following reasons: 1) for 40/65, based on the coder's knowledge of the field, it was clear that a step must have been taken to achieve an outcome or action, even though that decision was not explicitly mentioned (e.g., interviewees describe collecting certain raw data and then coming to a specific conclusion, so they must have decided how to analyze the data, even if they did not mention the analysis explicitly); 2) for 15/65, the interview context was important, in that multiple statements from different parts of the interview taken together were sufficient to conclude that the decision must have happened, though no single statement described that decision explicitly; 3) 10/65 involved a decision that was explicitly discussed as an important step in problem solving, but they did not directly state how it was related to the problem at hand, or it was stated only in response to a direct prompt from the interviewer. The proportion of decisions identified in each interview, broken down by either explicit or explicit + implied, is presented in Supplemental Tables S1 and S2. Table 2 and Figure 2 of the main text show explicit + implied decision numbers.
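The explicit versus explicit + implied breakdown above is simple bookkeeping. The following minimal Python sketch illustrates how the two proportions differ; the codes shown are hypothetical (five decisions rather than the full 29), not actual interview data.

```python
# Hypothetical tally illustrating the yes/no/implied bookkeeping described above.
# Each decision for one interview is marked "yes" (explicit), "implied", or "no".
codes = {1: "yes", 2: "implied", 3: "yes", 4: "no", 5: "yes"}  # toy data

explicit = sum(1 for c in codes.values() if c == "yes")
implied = sum(1 for c in codes.values() if c == "implied")
total = len(codes)

prop_explicit = explicit / total                  # explicit only
prop_with_implied = (explicit + implied) / total  # explicit + implied

print(f"explicit: {prop_explicit:.0%}, explicit + implied: {prop_with_implied:.0%}")
# → explicit: 60%, explicit + implied: 80%
```

With these toy codes, counting “implied” raises the proportion from 60% to 80%, mirroring the two columns reported in the supplemental tables.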

a See Supplemental Text and Table S2 for a full description and examples of each decision. A set of other non-decision knowledge and skill development themes were also frequently mentioned as important to professional success: staying up to date in the field (84%), intuition and experience (77%), interpersonal skills and teamwork (100%), efficiency (32%), and attitude (68%).

b Percentage of interviews in which category or decision was mentioned.

c Numbering is for reference. In practice, the ordering is fluid – it involves extensive iteration, with other possible starting points.

d Chosen predictive framework(s) will inform all other decisions.

e Reflection occurs throughout process, and often leads to iteration. Reflection on solution occurs at the end as well.

FIGURE 2. Proportion of decisions coded in interviews by field. This tabulation includes decisions 1–29, not the additional themes. Error bars represent standard deviations. Number of interviews: total = 31; physical science = 9; biological science = 8; engineering = 8; medicine = 6. Compared with the sciences, slightly fewer decisions overall were identified in the coding of engineering and medicine interviews, largely for discipline-specific reasons. See Supplemental Table S2 and associated discussion.

Two of the interviews that had not been discussed during earlier rounds of coding (one physics [AP and EB], one medicine [AP and CK]) were independently coded by two coders to check interrater reliability using the final list of decisions. The goal of our final coding was to tabulate whether or not each expert described making each decision at any point in the problem-solving process, so the level of detail we chose for coding and interrater reliability was whether or not a decision was present in the entire interview. The decisions identified in each interview were compared for the two coders. For each interview, the raters disagreed on the presence of only one of the 29 decisions. Codes of “implied” were counted as agreement if the other coder selected either “yes” or “implied.” This equates to a percent agreement of 97% for each interview (28 agreements/29 total decisions per interview = 97%). As a side note, there was also one disagreement per interview on the coding of the five other themes, but those themes were not a focus of this work or of the interviews.
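Under the agreement rule described above, a code of “implied” matches either “yes” or “implied,” so agreement reduces to asking whether both coders marked the decision as present. A minimal Python sketch of the per-interview calculation, using hypothetical coder data rather than the actual interview codes:

```python
# Hypothetical sketch of the per-interview percent-agreement calculation.
# "implied" counts as agreement with either "yes" or "implied", so agreement
# depends only on whether both coders marked the decision as present.

def decisions_agree(a, b):
    """True if the two codes count as agreement under the rule above."""
    present = {"yes", "implied"}
    return (a in present) == (b in present)

def percent_agreement(coder1, coder2):
    matches = sum(decisions_agree(a, b) for a, b in zip(coder1, coder2))
    return matches / len(coder1)

# Toy data: 29 decisions, with the coders differing on exactly one (index 3).
c1 = ["yes"] * 10 + ["implied"] * 2 + ["no"] * 17
c2 = list(c1)
c2[3] = "no"

print(f"{percent_agreement(c1, c2):.0%}")  # → 97%  (28/29 decisions agree)
```

A single disagreement over 29 decisions yields 28/29 ≈ 97%, the figure reported for each reliability-check interview.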

We identified a total set of 29 decisions to be made (plus five other themes), all of which appeared in a large fraction of the interviews across all disciplines ( Table 2 and Figure 2 ). There was a surprising degree of overlap across the different fields, with all the experts mentioning similar decisions to be made. All 29 were evident by the fifth semistructured interview, and on average, each interview revealed 85% of the 29 decisions. Many decisions occurred multiple times in an interview, with the number of times varying widely depending on the length and complexity of the problem-solving process discussed.

We focused our analysis on what decisions needed to be made, not on the experts’ processes for making those decisions: noting that a choice happened, not how they selected and chose among different alternatives. This is because, while the decisions to be made were the same across disciplines, how the experts made those decisions varied greatly by discipline and individual. The process of making the decisions relied on specialized disciplinary knowledge and experience and may vary depending on demographics or other factors that our study design (both our sample and nature of retrospective interviews) did not allow us to investigate. However, while that knowledge was distinct and specialized, we could tell that it was consistently organized according to a common structure we call a “predictive framework,” as discussed in the “ Predictive Framework ” section below. Also, while every “decision” reflected a step in the problem solving involved in the work, and the expert being interviewed was involved in making or approving the decision, that does not mean the decision process was carried out only by that individual. In many cases, the experts described the decisions made in terms of ideas and results of their teams, and the importance of interpersonal skills and teamwork was an important non-decision theme raised in all interviews.

We were particularly concerned with the correctness and completeness of the set of decisions. Although the correctness was largely established by the statements in the interviews, we also showed the list of decisions to these experts at the end of the interviews, as well as to about a dozen other experts. In all cases, they agreed that these decisions were ones they and others in their field made when solving problems. The completeness of the list of decisions was confirmed by: 1) looking carefully at all specific actions taken in the described problem-solving process and checking that each action matched a corresponding decision from the list; and 2) the high degree of consistency in the set of decisions across all the interviews and disciplines. This implies that it is unlikely that there are important decisions we are missing, because that would require any such missing decisions to be consistently unspoken by all 31 interviewees as well as consistently unrecognized by us in the actions taken during the problem-solving process.

In focusing on experts’ recollections of their successful solving of problems, our study design may have missed decisions that experts only made during failed problem-solving attempts. However, almost all interviews described solution paths that were not smooth and continuous, but rather involved going down numerous dead ends. There were approaches that were tried and failed, data that turned out to be ambiguous and worthless, and so on. Identifying the failed path involved reflection decisions (23–26). Often decision 9 (is problem solvable?) would be mentioned, because it described a path that was determined to be not solvable. For example, a biologist explained, “And then I ended up just switching to a different strain that did it [crawling off the plate] less. Because it was just … hard to really get them to behave themselves. I suppose if I really needed to rely on that very particular one, I probably would have exhausted the possibilities a bit more.” Thus, we expect unsuccessful problem solving would entail a smaller subset of decisions being made, particularly lack of reflection decisions, or poor choices on the decisions, rather than making a different set of decisions.

The set of decisions represent a remarkably consistent structure underlying S&E problem solving. For the purposes of presentation, we have categorized the decisions as shown in Figure 3 , roughly based on the purposes they achieve. However, the process is far less orderly and sequential than implied by this diagram, or in fact any characterization of an orderly “scientific method.” We were struck by how variable the sequence of decisions was in the descriptions provided. For example, experts who described how they began work on a problem sometimes discussed importance and goals (1–3, what is important in field?; opportunity fits solver’s expertise?; and goals, criteria, constraints?), but others mentioned a curious observation (20, any significant anomalies?), important features of their system that led them to questions (4, important features and info?, 6, how to narrow down problem?), or other starting points. We also saw that there were flexible connections between decisions and repeated iterations—jumping back to the same type of decision multiple times in the solution process, often prompted by reflection as new information and insights were developed. The sequence and number of iterations described varied dramatically by interview, and we cannot determine to what extent this was due to legitimate differences in the problem-solving process or to how the expert recalled and chose to describe the process. This lack of a consistent starting point, with jumping and iterating between decisions, has also been identified in the naturalistic decision-making literature ( Mosier et al. , 2018 ). Finally, the experts also often described considering multiple decisions simultaneously. In some interviews, a few decisions were always described together, while in others, they were clearly separate decisions. 
In summary, while the specific decisions themselves are fully grounded in expert practice, the categories and order shown here are artificial simplifications for presentation purposes.

FIGURE 3. Representation of problem-solving decisions by categories. The black arrows represent a hypothetical but unrealistic order of operations, the blue arrows represent more realistic iteration paths. The decisions are grouped into categories for presentation purposes; numbers indicate the number of decisions in each category. Knowledge and skill development were commonly mentioned themes but are not decisions.

The decisions contained in the seven categories are summarized here. See Supplemental Table S2 for specific examples of each decision across multiple disciplines.

Category A. Selection and Goals of the Problem

This category involves deciding on the importance of the problem, what criteria a solution must meet, and how well it matches the capabilities, resources, and priorities of the expert. As an example, an earth scientist described the goal of her project (decision 3, goals, criteria, constraints?) to map and date the earliest volcanic rocks associated with what is now Yellowstone and explained why the project was a good fit for her group (2, opportunity fits solver’s expertise?) and her decision to pursue the project in light of the significance of this type of eruption in major extinction events (1, what is important in field?). In many cases, decisions related to framing (see category B) were mentioned before decisions in this category or were an integral part of the process for developing goals.

1. What is important in the field?

What are important questions or problems? Where is the field heading? Are there advances in the field that open new possibilities?

2. Opportunity fits solver's expertise?

Are there gaps or opportunities to pursue in the field, and if so, where? Given experts’ unique perspectives and capabilities, are there opportunities particularly accessible to them? (This could involve challenging the status quo or questioning assumptions in the field.)

3. Goals, criteria, constraints?

a. What are the goals, design criteria, or requirements of the problem or its solution?

b. What is the scope of the problem?

c. What constraints are there on the solution?

d. What will be the criteria on which the solution is evaluated?

Category B. Frame Problem

These decisions lead to a more concrete formulation of the solution process and potential solutions. This involves identifying the key features of the problem and deciding on predictive frameworks to use (see “ Predictive Framework ” section below), as well as narrowing down the problem, often forming specific questions or hypotheses. Many of these decisions are guided by past problem solutions with which the expert is familiar and sees as relevant. The framing decisions of a physician can be seen in his discussion of a patient with liver failure who had previously been diagnosed with HIV but had features (4, important features and info?; 5, what predictive framework?) that made the physician question the HIV diagnosis (5, what predictive framework?; 26, how good is solution?). His team then searched for possible diagnoses that could explain liver failure and lead to a false-positive HIV test (7, related problems?; 8, potential solutions?), which led to their hypothesis the patient might have Q fever (6, how to narrow down problem?; 13, what info needed?; 15, specific plan for getting info?). While each individual decision is strongly supported by the data, the categories are groupings for presentation purposes. In particular, framing (category B) and planning (see category C) decisions often blended together in interviews.

4. Important features and info?

a. Which available information is relevant to problem solving and why?

b. (When appropriate) Create/find a suitable abstract representation of core ideas and information. Examples: in physics, an equation representing the process involved; in chemistry, bond diagrams/potential energy surfaces; in biology, a diagram of pathway steps.

5. What predictive framework?

Which potential predictive frameworks to use? (Decide among possible predictive frameworks or create framework.) This includes deciding on the appropriate level of mechanism and structure that the framework needs to embody to be most useful for the problem at hand.

6. How to narrow down the problem?

How to narrow down the problem? Often involves formulating specific questions and hypotheses.

7. Related problems?

What are related problems or work seen before, and what aspects of their problem-solving process and solutions might be useful in the present context? (This may involve reviewing literature and/or reflecting on experience.)

8. Potential solutions?

What are potential solutions? (These are based on experience and on criteria the solver holds for solutions to problems with the general key features identified.)

9. Is problem solvable?

Is the problem plausibly solvable and is the solution worth pursuing given the difficulties, constraints, risks, and uncertainties?

Category C. Plan the Process for Solving

These decisions establish the specifics needed to solve the problem and include: how to simplify the problem and decompose it into pieces, what specific information is needed, how to obtain that information, and what are the resources needed and priorities? Planning by an ecologist can be seen in her extensive discussion of her process of simplifying (10, approximations/simplifications to make?) a meta-analysis project about changes in migration behavior, which included deciding what types of data she needed (13, what info needed?), planning how to conduct her literature search (15, specific plan for getting info?), difficulties in analyzing the data (12, most difficult/uncertain areas?; 16, which calculations and data analysis?), and deciding to analyze different taxonomic groups separately (11, how to decompose into subproblems?). In general, decomposition often resulted in multiple iterations through the problem-solving decisions, as subsets of decisions need to be made about each decomposed aspect of a problem. Framing (category B) and planning (category C) decisions occupied much of the interviews, indicating their importance.

10. Approximations and simplifications to make?

What approximations or simplifications are appropriate? How to simplify the problem to make it easier to solve? Test possible simplifications/approximations against established criteria.

11. How to decompose into subproblems?

How to decompose the problem into more tractable subproblems? (Subproblems are independently solvable pieces with their own subgoals.)

12. Most difficult or uncertain areas?

a. What are acceptable levels of uncertainty with which to proceed at various stages?

13. What info needed?

a. What will be sufficient to test and distinguish between potential solutions?

14. Priorities?

What to prioritize among many competing considerations? What to do first and how to obtain necessary resources?

Considerations could include: What's most important? Most difficult? Addressing uncertainties? Easiest? Constraints (time, materials, etc.)? Cost? Optimization and trade-offs? Availability of resources? (facilities/materials, funding sources, personnel)

15. Specific plan for getting information?

a. What are the general requirements of a problem-solving approach, and what general approach will they pursue? (These decisions are often made early in the problem-solving process as part of framing.)

b. How to obtain needed information? Then carry out those plans. (This could involve many discipline- and problem-specific investigation possibilities such as: designing and conducting experiments, making observations, talking to experts, consulting the literature, doing calculations, building models, or using simulations.)

c. What are achievable milestones, and what are metrics for evaluating progress?

d. What are possible alternative outcomes and paths that may arise during the problem-solving process, both consistent with predictive framework and not, and what would be paths to follow for the different outcomes?

Category D. Interpret Information and Choose Solution(s)

This category includes deciding how to analyze, organize, and draw conclusions from available information, reacting to unexpected information, and deciding upon a solution. A biologist studying aging in worms described how she analyzed results from her experiments, which included representing her results in survival curves and conducting statistical analyses (16, which calculations and data analysis?; 17, how to represent and organize info?), as well as setting up blind experiments (15, specific plan for getting info?) so that she could make unbiased interpretations (18, how believable is info?) of whether a worm was alive or dead. She also described comparing results with predictions to justify the conclusion that worm aging was related to fertility (19, how does info compare to predictions?; 21, appropriate conclusions?; 22, what is best solution?). Deciding how results compared with expectations based on a predictive framework was a key decision that often preceded several other decisions.

16. Which calculations and data analysis?

What calculations and data analysis are needed? Once determined, these must then be carried out.

17. How to represent and organize information?

What is the best way to represent and organize available information to provide clarity and insights? (Usually this will involve specialized and technical representations related to key features of predictive framework.)

18. How believable is the information?

Is information valid, reliable, and believable (includes recognizing potential biases)?

19. How does information compare to predictions?

As new information comes in, particularly from experiments or calculations, how does it compare with expected results (based on the predictive framework)?

20. Any significant anomalies?

a. Does potential anomaly fit within acceptable range of predictive framework(s) (given limitations of predictive framework and underlying assumptions and approximations)?

b. Is potential anomaly an unusual statistical variation or relevant data? Is it within acceptable levels of uncertainty?
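One common, discipline-agnostic way to screen for decision 20b is to compare a candidate anomaly against the spread of replicate measurements. The sketch below assumes normally distributed noise and a conventional 3-sigma threshold; both are assumptions that a solver's predictive framework must justify, and the function names are illustrative only.

```python
from statistics import mean, stdev

def flag_anomaly(replicates, candidate, threshold=3.0):
    """Flag `candidate` as a potential anomaly if it lies more than
    `threshold` sample standard deviations from the mean of `replicates`.
    This is a crude screen only: it cannot distinguish relevant new data
    from a faulty measurement -- that judgment requires the predictive
    framework and its acceptable levels of uncertainty."""
    m, s = mean(replicates), stdev(replicates)
    z = abs(candidate - m) / s
    return z > threshold, z

replicates = [9.8, 10.1, 9.9, 10.2, 10.0]
is_anomalous, z = flag_anomaly(replicates, 12.5)
print(is_anomalous, round(z, 1))  # prints: True 15.8
```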

21. Appropriate conclusions?

What are appropriate conclusions based on the data? (This involves making conclusions and deciding if they are justified.)

22. What is the best solution?

a. Which of multiple candidate solutions are consistent with all available information and which can be rejected? (This could be based on comparing data with predicted results.)

b. What refinements need to be made to candidate solutions?

Category E. Reflect

Reflection decisions occur throughout the process and include deciding whether assumptions are justified, whether additional knowledge or information is needed, how well the solution approach is working, and whether potential and then final solutions are adequate. These decisions match the categories of reflection identified by Salehi (2018). A mechanical engineer described developing a model (to inform surgical decisions) of which muscles allow the thumb to function in the most useful manner (22, what is best solution?), including reflecting on how well engineering approximations applied in the biological context (23, assumptions and simplifications appropriate?). He also described reflecting on his approach, that is, why he chose to use cadaveric models instead of mathematical models (25, how well is solving approach working?), and the limitations of his findings in that the “best” muscle identified was difficult to access surgically (26, how good is solution?; 27, broader implications?). Reflection decisions are made throughout the problem-solving process, often lead to reconsidering other decisions, and are critical for success.

23. Assumptions and simplifications appropriate?

a. Do the assumptions and simplifications made previously still look appropriate considering new information?

b. Does predictive framework need to be modified?

24. Additional knowledge needed?

a. Is solver's relevant knowledge sufficient?

b. Is more information needed and, if so, what?

c. Does some information need to be checked? (Is there a need to repeat experiment or check a different source?)

25. How well is the problem-solving approach working?

How well is the problem-solving approach working, and does it need to be modified? This includes possibly modifying the goals and reflecting on one's strategy by evaluating progress toward the solution.

26. How good is the solution?

a. Decide by exploring possible failure modes and limitations—“try to break” solution.

b. Does it “make sense” and pass discipline-specific tests for solutions of this type of problem?

c. Does it completely meet the goals/criteria?

Category F. Implications and Communication of Results

These are decisions about the broader implications of the work, and how to communicate results most effectively. For example, a theoretical physicist developing a method to calculate the magnetic moment of the muon decided on who would be interested in his work (28, audience for communication?) and what would be the best way to present it (29, best way to present work?). He also discussed the implications of preliminary work on a simplified aspect of the problem (10, approximations and simplifications to make?) in terms of evaluating its impact on the scientific community and deciding on next steps (27, broader implications?; 29, best way to present work?). Many interviewees described that making decisions in this category affected their decisions in other categories.

27. Broader implications?

What are the broader implications of the results, including over what range of contexts does the solution apply? What outstanding problems in the field might it solve? What novel predictions can it enable? How and why might this be seen as interesting to a broader community?

28. Audience for communication?

What is the audience for communication of work, and what are their important characteristics?

29. Best way to present work?

What is the best way to present the work to have it understood, and its correctness and importance appreciated? How to make a compelling story of the work?

Category G. Ongoing Skill and Knowledge Development

Although we focused on decisions in the problem-solving process, the experts volunteered general skills and knowledge they saw as important elements of problem-solving expertise in their fields. These included teamwork and interpersonal skills (strongly emphasized), acquiring experience and intuition, and keeping abreast of new developments in their fields.

30. Stay up to date in field

a. Reviewing the literature, which itself involves deciding which work is important.

b. Learning relevant new knowledge (ideas and technology from literature, conferences, colleagues, etc.)

31. Intuition and experience

Acquiring experience and associated intuition to improve problem solving.

32. Interpersonal, teamwork

Includes navigating collaborations, team management, patient interactions, communication skills, etc., particularly as these apply in the context of the various types of problem-solving processes.

33. Efficiency

Time management including learning to complete certain common tasks efficiently and accurately.

34. Attitude

Motivation and attitude toward the task. Factors such as interest, perseverance, dealing with stress, and confidence in decisions.

Predictive Framework

How the decisions were made was highly dependent on the discipline and problem. However, there was one element that was fundamental and common across all interviews: the early adoption of a “predictive framework” that the experts used throughout the problem-solving process. We define this framework as “a mental model of key features of the problem and the relationships between the features.” All the predictive frameworks involved some degree of simplification and approximation and an underlying level of mechanism that established the relationships between key features. The frameworks provided a structure of knowledge and facilitated the application of that knowledge to the problem at hand, allowing experts to repeatedly run “mental simulations” to make predictions for dependencies and observables and to interpret new information.

As an example, an ecologist described her predictive framework for migration, which incorporated important features such as environmental conditions and genetic differences between species and the mechanisms by which these interacted to impact the migration patterns for a species. She used this framework to guide her meta-analysis of changes in migration patterns, affecting everything from her choice of data sets to include to her interpretation of why migration patterns changed for different species. In many interviews, the frameworks used evolved as additional information was obtained, with additional features being added or underlying assumptions modified. For some problems, the relevant framework was well established and used with confidence, while for other problems, there was considerable uncertainty as to a suitable framework, so developing and testing the framework was a substantial part of the solution process.

A predictive framework contains the expert knowledge organization that has been observed in previous studies of expertise ( Egan and Greeno, 1974 ) but goes further, as here it serves as an explicit tool that guides most decisions and actions during the solving of complex problems. Mental models and mental simulations that are described in the naturalistic decision-making literature are similar, in that they are used to understand the problem and guide decisions ( Klein, 2008 ; Mosier et al. , 2018 ), but they do not necessarily contain the same level of mechanistic understanding of relationships that underlies the predictive frameworks used in science and engineering problem solving. While the use of predictive frameworks was universal, the individual frameworks themselves explicitly reflected the relevant specialized knowledge, structure, and standards of the discipline, and arguably largely define a discipline ( Wieman, 2019 ).

Discipline-Specific Variation

While the set of decisions to be made was highly consistent across disciplines, there were extensive differences within and across disciplines and work contexts, which reflected the differences in perspectives and experiences. These differences were usually evident in how experts made each of the specific decisions, but not in the choice of which decisions needed to be made. In other words, the solution methods, which included following standard accepted procedures in each field, were very different. For example, planning in some experimental sciences may involve formulating a multiyear construction and data-collection effort, while in medicine it may be deciding on a simple blood test. Some decisions, notably in categories A, D, and F, were less likely to be mentioned in particular disciplines, because of the nature of the problems. Specifically, decisions 1 (what is important in field?), 2 (opportunity fits solver’s expertise?), 27 (broader implications?), 28 (audience for communication?), and 29 (best way to present work?) were dependent on the scope of the problem being described and the expert's specific role in it. These were mentioned less frequently in interviews where the problem was assigned to the expert (most often engineering or industry) or where the importance or audience was implicit (most often in medicine). Decisions 16 (which calculations and data analysis?) and 17 (how to represent and organize info?) were particularly unlikely to be mentioned in medicine, because test results are typically provided to doctors not in the form of raw data, but rather already analyzed by a lab or other medical technology professional, so the doctors we interviewed did not need to make decisions themselves about how to analyze or represent the data. Qualitatively, we also noticed some differences between disciplines in the patterns of connections between decisions.
When the problem involved development of a tool or product, most commonly the case in engineering, the interview indicated relatively rapid cycles between goals (3), framing problem/potential solutions (8), and reflection on the potential solution (26), before going through the other decisions. Biology, the experimental science most represented in our interviews, had strong links between planning (15), deciding on appropriate conclusions (21), and reflection on the solution (26). This is likely because the respective problems involved complex systems with many unknowns, so careful planning was unusually important for achieving definitive conclusions. See Supplemental Text and Supplemental Table S2 for additional notes on decisions that were mentioned at lower frequency and decisions that were likely to be interconnected, regardless of field.

This work has created a framework of decisions to characterize problem solving in science and engineering. This framework is empirically based and captures the successful problem-solving process of all experts interviewed. We see that several dozen experts across many different fields all make a common set of decisions when solving authentic problems. There are flexible linkages between decisions that are guided by reflection in a continually evolving process. We have also identified the nature of the “predictive frameworks” that S&E experts consistently use in problem solving. These predictive frameworks reveal how these experts organize their disciplinary knowledge to facilitate making decisions. Many of the decisions we identified are reflected in previous work on expertise and scientific problem solving. This is particularly true for those listed in the planning and interpreting information categories ( Egan and Greeno, 1974 ). The priority experts give to framing and planning decisions over execution compared with novices has been noted repeatedly (e.g., Chi et al. , 1988 ). Expert reflection has been discussed, but less extensively ( Chase and Simon, 1973 ), and elements of the selection and implications and communication categories have been included in policy and standards reports (e.g., AAAS, 2011 ). Thus, our framework of decisions is consistent with previous work on scientific practices and expertise, but it is more complete, specific, empirically based, and generalizable across S&E disciplines.

A limitation of this study is the small number of experts we have in total, from each discipline, and from underrepresented groups (especially lack of female representation in engineering). The lack of randomized selection of participants may also bias the sample toward experts who experienced similar academic training (STEM disciplines at U.S. universities). This means we cannot prove that there are not some experts who follow other paths in problem solving. As with any scientific model, the framework described here should be subjected to further tests and modifications as necessary. However, to our knowledge, this is a far larger sample than used in any previous study of expert problem solving. Although we see a large amount of variation both within and across disciplines in the problem-solving process, this is reflected in how experts make decisions, not in what decisions they make. The very high degree of consistency in the decisions made across the entire sample strongly suggests that we are capturing elements that are common to all experts across science and engineering. A second limitation is that decisions often overlap and co-occur in an interview, so the division between decision items is often somewhat ambiguous and could be defined somewhat differently. As noted, a number of these decisions can be interconnected, and in some fields are nearly always interconnected.

The set of decisions we have observed provides a general framework for characterizing, analyzing, and teaching S&E problem solving. These decisions likely define much of the set of cognitive skills a student needs to practice and master to perform as a skilled practitioner in S&E. This framework of decisions provides a detailed and structured way to approach the teaching and measurement of problem solving at the undergraduate, graduate, and professional training levels. For teaching, we propose using the process of “deliberate practice” ( Ericsson, 2018 ) to help students learn problem solving. Deliberate practice of problem solving would involve effective scaffolding and concentrated practice, with feedback, at making the specific decisions identified here in relevant contexts. In a course, this would likely involve only an appropriately selected set of the decisions, but a good research mentor would ensure that trainees have opportunities to practice and receive feedback on their performance on each of these 29 decisions. Future work is needed to determine whether there are additional decisions that were not identified in experts but are productive components of student problem solving and should also be practiced. Measurements of individual problem-solving expertise based on our decision list and the associated discipline-specific predictive frameworks will allow a detailed measure of an individual's discipline-specific problem-solving strengths and weaknesses relative to an established expert. This can be used to provide targeted feedback to the learner, and when aggregated across students in a program, feedback on the educational quality of the program. We are currently working on the implementation of these ideas in a variety of instructional settings and will report on that work in future publications.

As discussed in the Introduction , typical science and engineering problems fail to engage students in the complete problem-solving process. By considering which of the 29 decisions are required to answer the problem, we can more clearly articulate why. The biology problem, for example, requires students to decide on a predictive framework and access the necessary content knowledge, and they need to decide which information they need to answer the problem. However, other decisions are not required or are already made for them, such as deciding on important features and identifying anomalies. We propose that different problems, designed specifically to require students to make sets of the problem-solving decisions from our framework, will provide more effective tools for measuring, practicing, and ultimately mastering the full S&E problem-solving process.

Our preliminary work with the use of such decision-based problems for assessing problem-solving expertise is showing great promise. For several different disciplines, we have given test subjects a relevant context, requiring content knowledge covered in courses they have taken, and asked them to make decisions from the list presented here. Skilled practitioners in the relevant discipline respond in very consistent ways, while students respond very differently and show large differences that typically correlate with their different educational experiences. What apparently matters is not what content they have seen, but rather what decisions they have had practice making. Our approach was to identify the decisions made by experts, this being the task that educators want students to master. Our data do not exclude the possibility that students engage in and/or should learn other decisions as a productive part of the problem-solving process while they are learning. Future work would seek to identify decisions made at intermediate levels during the development of expertise, to identify potential learning progressions that could be used to teach problem solving more efficiently. What we have seen is consistent with previous work identifying expert–novice differences but provides a much more extensive and detailed picture of a student's strengths and weaknesses and the impacts of particular educational experiences. We have also carried out preliminary development of courses that explicitly involve students making and justifying many of these decisions in relevant contexts, followed by feedback on their decisions. Preliminary results from these courses are also encouraging. Future work will involve the more extensive development and application of decision-based measurement and teaching of problem solving.

ACKNOWLEDGMENTS

We acknowledge the many experts who agreed to be interviewed for this work, M. Flynn for contributions on expertise in mechanical engineering, and Shima Salehi for useful discussions. This work was funded by the Howard Hughes Medical Institute through an HHMI Professor grant to C.E.W.

  • ABET. (2020). Criteria for accrediting engineering programs, 2020–2021. Retrieved November 23, 2020, from www.abet.org/accreditation/accreditation-criteria/criteria-for-accrediting-engineering-programs-2020-2021
  • Alberts, B., Johnson, A., Lewis, J., Morgan, D., Raff, M., Roberts, K., & Walter, P. (2014). Control of gene expression. In Molecular biology of the cell (6th ed., pp. 436–437). New York: Garland Science. Retrieved November 12, 2020, from https://books.google.com/books?id=2xIwDwAAQBAJ
  • American Association for the Advancement of Science. (2011). Vision and change in undergraduate biology education: A call to action. Washington, DC. Retrieved February 12, 2021, from https://visionandchange.org/finalreport
  • Chi, M. T. H., Glaser, R., & Farr, M. J. (1988). The nature of expertise. Hillsdale, NJ: Erlbaum.
  • Crandall, B., Klein, G. A., & Hoffman, R. R. (2006). Working minds: A practitioner's guide to cognitive task analysis. Cambridge, MA: MIT Press.
  • Egan, D. E., & Greeno, J. G. (1974). Theory of rule induction: Knowledge acquired in concept learning, serial pattern learning, and problem solving. In Gregg, L. W. (Ed.), Knowledge and cognition. Potomac, MD: Erlbaum.
  • Ericsson, K. A., Charness, N., Feltovich, P. J., & Hoffman, R. R. (Eds.). (2006). The Cambridge handbook of expertise and expert performance. Cambridge, United Kingdom: Cambridge University Press.
  • Ericsson, K. A., Hoffman, R. R., Kozbelt, A., & Williams, A. M. (Eds.). (2018). The Cambridge handbook of expertise and expert performance (2nd ed.). Cambridge, United Kingdom: Cambridge University Press.
  • Hatano, G., & Inagaki, K. (1986). Two courses of expertise. In Stevenson, H. W., Azuma, H., & Hakuta, K. (Eds.), Child development and education in Japan (pp. 262–272). New York: Freeman/Times Books/Henry Holt.
  • Klein, G. (2008). Naturalistic decision making. Human Factors, 50(3), 456–460.
  • Kozma, R., Chin, E., Russell, J., & Marx, N. (2000). The roles of representations and tools in the chemistry laboratory and their implications for chemistry learning. Journal of the Learning Sciences, 9(2), 105–143.
  • Lintern, G., Moon, B., Klein, G., & Hoffman, R. (2018). Eliciting and representing the knowledge of experts. In Ericsson, K. A., Hoffman, R. R., Kozbelt, A., & Williams, A. M. (Eds.), The Cambridge handbook of expertise and expert performance (2nd ed., pp. 165–191). Cambridge, United Kingdom: Cambridge University Press.
  • Mosier, K., Fischer, U., Hoffman, R. R., & Klein, G. (2018). Expert professional judgments and “naturalistic decision making.” In Ericsson, K. A., Hoffman, R. R., Kozbelt, A., & Williams, A. M. (Eds.), The Cambridge handbook of expertise and expert performance (2nd ed., pp. 453–475). Cambridge, United Kingdom: Cambridge University Press.
  • National Research Council (NRC). (2012a). A framework for K–12 science education: Practices, crosscutting concepts, and core ideas. Washington, DC: National Academies Press.
  • Newell, A., & Simon, H. A. (1972). Human problem solving. Englewood Cliffs, NJ: Prentice-Hall.
  • Next Generation Science Standards Lead States. (2013). Next Generation Science Standards: For states, by states. Washington, DC: National Academies Press.
  • Polya, G. (1945). How to solve it: A new aspect of mathematical method. Princeton, NJ: Princeton University Press.
  • Quacquarelli Symonds. (2018). The global skills gap in the 21st century. Retrieved July 20, 2021, from www.qs.com/portfolio-items/the-global-skills-gap-in-the-21st-century/
  • Salehi, S. (2018). Improving problem-solving through reflection (Doctoral dissertation). Stanford Digital Repository, Stanford University. Retrieved February 18, 2021, from https://purl.stanford.edu/gc847wj5876
  • Schoenfeld, A. H. (1985). Mathematical problem solving. Orlando, FL: Academic Press.
  • Wayne State University. (n.d.). Mechanical engineering practice qualifying exams. Wayne State University Mechanical Engineering department. Retrieved February 23, 2021, from https://engineering.wayne.edu/me/exams/mechanics_of_materials_-_sample_pqe_problems_.pdf
  • Wineburg, S. (1998). Reading Abraham Lincoln: An expert/expert study in the interpretation of historical texts. Cognitive Science, 22(3), 319–346. https://doi.org/10.1016/S0364-0213(99)80043-3


Submitted: 2 December 2020 Revised: 11 June 2021 Accepted: 23 June 2021

© 2021 A. M. Price et al. CBE—Life Sciences Education © 2021 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

BMC Medical Education

Using concept mapping to evaluate knowledge structure in problem-based learning

Chia-Hui Hung

1 Department of Rehabilitation, Jen-Teh Junior College of Medicine, Nursing and Management, No. 79–9, Sha-Luen Hu, Xi-Zhou Li, Hou-Loung Town, Miaoli, Taiwan

Chen-Yung Lin

2 Graduate Institute of Science Education, National Taiwan Normal University, No. 88, Ting-Jou Rd., sec. 4, Taipei, Taiwan

Background

Many educational programs incorporate problem-based learning (PBL) to promote students’ learning; however, the knowledge structure developed in PBL remains unclear. The aim of this study was to use concept mapping to generate an understanding of the use of PBL in the development of knowledge structures.

Methods

Using a quasi-experimental study design, we employed concept mapping to illustrate the effects of PBL by examining the patterns of concepts and differences in the knowledge structures of students taught with and without a PBL approach. Fifty-two occupational therapy undergraduates were involved in the study and were randomly divided into PBL and control groups. The PBL group was given two case scenarios for small group discussion, while the control group continued with ordinary teaching and learning. Students were asked to make concept maps after being taught about knowledge structure. A descriptive analysis of the morphology of concept maps was conducted in order to compare the integration of the students’ knowledge structures, and statistical analyses were done to understand the differences between groups.

Results

Three categories of concept maps were identified as follows: isolated, departmental, and integrated. The students in the control group constructed more isolated maps, while the students in the PBL group tended toward integrated mapping. Concept Relationships, Hierarchy Levels, and Cross Linkages in the concept maps were significantly greater in the PBL group; however, examples of concept maps did not differ significantly between the two groups.
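The structural measures compared here (Concept Relationships, Hierarchy Levels, Cross Linkages, and examples) can be counted and combined mechanically once a map has been scored by raters. A hypothetical sketch loosely following Novak and Gowin's weighting scheme; the 1/5/10 weights are their suggested values, not necessarily those used in this study.

```python
def score_concept_map(propositions, levels, cross_links, examples=0):
    """Novak & Gowin-style structural score for a concept map.
    propositions: valid concept-link-concept relationships (1 pt each)
    levels: valid hierarchy levels (5 pts each)
    cross_links: valid links across branches of the hierarchy (10 pts each)
    examples: concrete instances of concepts (1 pt each)
    Weights are Novak & Gowin's suggestions; studies often adapt them."""
    return propositions * 1 + levels * 5 + cross_links * 10 + examples * 1

# An "isolated" map vs. an "integrated" map with the same number of
# propositions: cross-links and deeper hierarchy dominate the score,
# mirroring the group differences reported above.
isolated = score_concept_map(propositions=12, levels=2, cross_links=0)
integrated = score_concept_map(propositions=12, levels=4, cross_links=3)
print(isolated, integrated)  # prints: 22 62
```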

Conclusions

The data indicated that PBL had a strong effect on the acquisition and integration of knowledge. The important properties of PBL, including situational learning, problem spaces, and small group interactions, can help students to acquire more concepts, achieve an integrated knowledge structure, and enhance clinical reasoning.

Competent practitioners in the health care professions are developed not only through the acquisition of the biomedical knowledge and clinical skills necessary to provide high-quality, effective services but also through the development of an integrated knowledge structure in an active and personal way [ 1 – 4 ]. A knowledge structure, which is the set of cognitive processes used by clinical practitioners in the diagnosis of patients, is characterized by an elaborate, highly integrated framework of related concepts [ 3 , 5 ]. Knowledge structure theory implies that both learners and experts can be influenced by their prior knowledge or underlying knowledge structures when producing diagnostic hypotheses and participating in problem-solving activities [ 6 – 8 ]. If educators can employ better ways to facilitate the development of an integrated knowledge structure than the rote memorization of facts or procedural practice, then it is likely that they will be able to promote the development of greater competence in the health care professions.

Problem-based learning (PBL) originated in Canada in the 1960s in response to dissatisfaction with the traditional didactic teaching curriculum in medical education and a perceived need for reform in the education of medical students [ 9 , 10 ]. The PBL approach, an innovative teaching and learning method utilized in medical education, may provide greater challenge and motivation by utilizing real-life scenarios to engage students by activating their prior knowledge, increasing understanding of basic science concepts, and organizing compartmental knowledge to construct a rich, elaborate, and well-integrated knowledge structure, in order to foster learning and transfer knowledge from the theoretical to the clinical context [ 11 , 12 ]. Furthermore, elaborately designed problems can stimulate self-directed learning strategies, team participation skills, information retention, and reasoning and problem-solving skills that will be available to the student after graduation [ 13 – 15 ]. If basic science and declarative knowledge can actually be converted into skills or “demonstrations”, then higher levels of performance or competency can be achieved [ 2 , 16 ]. Although PBL can enhance problem-solving and clinical reasoning skills in the health care professions [ 17 ], previous research on its superiority to the lecture-based learning (LBL) approach in the acquisition of basic science knowledge has produced inconsistent findings [ 18 – 21 ]. Research has also indicated that an integrated knowledge structure, rather than compartmentalized knowledge, is a prerequisite for successful problem-solving [ 2 , 17 ]; however, little supportive empirical evidence has been reported to show that the development of a knowledge structure is enhanced by PBL [ 2 ].

Since the original purpose of PBL was to promote deeper content learning [ 22 ], it is important to develop insights into students’ knowledge structures; however, the objective assessment methods often employed in PBL focus on bits of factual knowledge and techniques in medical problem solving and tend to value formal and routine procedural reasoning [ 7 , 8 , 23 ]. Developing knowledge and problem-solving skills in PBL is not sufficient; it is also important to develop higher-order thinking skills and meaningful learning with organized concepts, as opposed to the mere collection of facts [ 11 , 24 ]. Studies of health care professionals suggest that concept mapping can provide a clear representation of a student’s knowledge structure [ 14 , 15 , 25 , 26 ]. Concept mapping is a schematic device for organizing and representing a set of concepts embedded in a framework of propositions by graphically illustrating the complex processes or relationships among relevant concepts within a given subject domain [ 13 , 27 – 30 ]. Concept mapping, which was developed by Novak and Gowin based on the Ausubelian Association Theory of meaningful learning [ 30 , 31 ], can be used to show the whole knowledge structure of students. The process of learning refers to the anchoring of new ideas or concepts in previously acquired knowledge in a non-arbitrary way, thereby allowing students to differentiate concepts, integrate them into an existing knowledge structure, and form linkages among isolated concepts through their own intentional effort [ 14 , 26 , 32 ]. In this view, lower-order concepts are linked in progressively more connected patterns (from linear to departmental) and integrated under higher-order concepts through integrative reconciliation and progressive differentiation. In integrative reconciliation, meaningful learning makes it easier for students to identify the similarities or differences between concepts, thus enabling them to take the relevant concepts and construct a superordinate concept [ 31 , 33 ]. In progressive differentiation, higher-order concepts are differentiated into more elaborate and hierarchical levels in the knowledge structure [ 34 , 35 ]. Thus, concept mapping has the potential to assess the dynamic reasoning about concept relationships in students’ knowledge structures during PBL [ 36 ]. Observing the structure and details can help teachers to identify difficulties in reasoning and improve students’ higher-order thinking skills [ 37 ].

Concept mapping may serve as an effective, feasible, and acceptable tool for evaluating and monitoring students' learning in PBL [36–38]. Its effectiveness can be assessed with two approaches. First, concept mapping can show the formation of a knowledge structure, from the basic structure to a depiction of the hierarchy and relationships among concepts [39, 40]. Second, it can also show the high degree of coherence and connectedness within the knowledge structure that is related to the holistic morphology of construction patterns [41]. Both approaches concentrate on understanding the structure of knowledge to show the depth of thinking required in clinical reasoning [42]. These two approaches, which deal with the inner hierarchy and the morphological features, were employed together in the current study to reveal the expansion and evolution of knowledge structures as a result of PBL.

This study compared the effects of PBL and LBL on students' development of knowledge structures. The research question was: What patterns of concepts, and what differences in the knowledge structures between the PBL and LBL groups, can be identified from the use of concept maps to evaluate students' learning achievement? A quasi-experimental design was employed. Concept mapping was used to evaluate learning outcomes, including the patterns of concept mapping and the knowledge structure.

Participants

The study was conducted as part of the course "Assessment and Management of Brain Function: Perspectives of Occupational Therapy" in an occupational therapy program at a medical college in the Taipei area. A total of 52 third-year occupational therapy undergraduates (20 to 21 years of age) participated in the study. None of the participants had previously been exposed to PBL or concept mapping. The students were divided into four small groups of 13 students each; the researcher randomly assigned each student to one of the four groups according to the student's registration number. Two of the groups were assigned to the PBL experimental condition, and the other two to the LBL control condition. G*Power statistical software was used to calculate a sample size sufficient for a power of 0.8, as suggested by Howell [43]. Given the mean difference, standard deviation, and effect size noted in the results, the sample size of 26 participants per group was adequate. Although no explicit IRB approval was sought, since it was not required for educational research in Taiwan in 2010, ethical approval was granted by three occupational therapy professors outside the research team. The general principles of the Declaration of Helsinki were followed, identifying information was removed from the data to ensure anonymity, and informed consent was obtained from the participants.
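The sample-size reasoning above can be sketched numerically. The snippet below is an illustrative re-creation rather than the authors' actual G*Power session; it uses the standard normal approximation for a two-sided, two-sample t-test, which lands one participant below the exact noncentral-t value (26 per group) used in the study.

```python
from math import ceil
from statistics import NormalDist

def n_per_group(effect_size: float, alpha: float = 0.05, power: float = 0.8) -> int:
    """Approximate per-group sample size for a two-sided, two-sample t-test,
    using the z-based (normal approximation) formula."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # 1.96 for alpha = 0.05
    z_beta = NormalDist().inv_cdf(power)           # 0.84 for power = 0.80
    return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# For a large effect (Cohen's d = 0.8) the approximation gives 25 per group;
# the exact noncentral-t calculation, as implemented in G*Power, gives 26.
print(n_per_group(0.8))  # -> 25
```

The one-participant gap between the approximation and the exact calculation is typical for small samples, where the heavier tails of the t distribution demand slightly more data.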

Preparation of the PBL Program

The PBL program used in this study was based on the Maastricht 7-step PBL method [10]. During the PBL sessions, students worked in collaboration with group members to analyze two cases of cognitive disability in occupational therapy. The PBL sessions took place 1 day per week over 6 weeks, with each case lasting 3 weeks. The students followed the 7-step process to work through the PBL problems. This allowed them to explore the symptoms and clinical problems of patients with cognitive disabilities, demonstrate clinical reasoning, make appropriate intervention decisions, and finally develop management plans. The PBL scenarios presented four key features designed to trigger motivation, connect to prior knowledge, organize content knowledge, and evoke new learning.

The first feature was designed to trigger motivation. The initial chapter, a half-page case scenario, presented the dilemma faced by clients in their daily occupations. The scenario contained ordinary descriptions without professional terminology in order to facilitate students’ motivation and to elicit their concepts of the clinical problems.

The second feature was designed to connect concepts to prior knowledge. The subsequent two chapters in the PBL, each of which was one page in length, provided the medical history, occupational performance, and laboratory data of the clients. The aim of this stage was to develop extensive knowledge connections and a conceptual hierarchy. Students were encouraged to use professional terminology to describe symptoms and occupational therapy problems, choose an appropriate method of evaluation to identify the problems, and decide on their interventions.

The third feature was the problem space. The last chapter, one page in length, briefly described the key interventions of occupational therapy and the prognosis of the client. Several intervention techniques were roughly described as cues for possible solutions, and problem spaces were open for students to organize content knowledge, engage in discussions with group members, and develop their intervention plans.

The fourth feature was designed to link affective and attitudinal issues to facilitate learning. Ethical issues were provided in the scenarios to evoke further discussion among the students in the area of medical humanities.

Data collection

Students in both groups attended basic biomedical classes on brain function for 3 hours per week over 10 weeks. The two groups were then separately exposed to the different learning methods for 6 weeks; PBL for the PBL group, and lectures for the LBL group. The LBL group met for 3 hours once per week, during which they continued their lectures and practiced with evaluation tools, and further practice or discussion was allowed after the classes.

Before students made their concept maps, they received two hours of instruction on concept mapping. Concept mapping was carried out after each problem case in order to understand how their knowledge changed as they gradually became involved in the discussion, and students handed in their maps the following week. Two sets of 10 terms related to cognitive disability in occupational therapy were given, as follows: executive function, experience, neuropsychological evaluation, aging, compensatory strategies, routine, tabletop activity, LOTCA, problem solving, abstract thinking, attention, higher-order thinking ability, self-awareness, culture, physical evaluation, A-ONE, cognitive retraining, personality, habit, and computer-based exercise. The participants were asked to prepare their own concept maps and encouraged to add any terms that they felt necessary to complete them. Two concept maps were developed by each student, and in the end, 104 concept maps were collected.

Analytical process and methods

The analysis was conducted in two steps. The first step focused on developing a global view of the knowledge structure, and the second examined the detailed connections among concepts in the concept maps. In the first step, two teachers of occupational therapy examined the maps to determine their morphology, including the whole structure and the component blocks. The two teachers identified three types of morphology: isolated mapping, departmental mapping, and integrated mapping. Chi-square tests were conducted to test the homogeneity of the two groups across the three types. In the second step, the quantitative scoring protocol devised by Novak & Gowin [30] was employed to investigate the students' concept maps. The counts for each of the four scoring parameters (the relationships among the concepts, the levels on the map, the cross linkages, and the examples) were calculated by the two teachers. Independent-samples t-tests were performed to compare the differences in the knowledge structures between the PBL and LBL groups in terms of the concept mapping scoring parameters. The intra-class correlation coefficient (ICC) was calculated to estimate the inter-rater reliability for each scoring parameter. Figure 1 shows a concept map made by one participant. In this map, the term 'dementia' was initially chosen as the core of the map, and the other terms related to dementia, such as symptoms, evaluations, and interventions, were then added to develop the concept map. The map depicts a rather complicated knowledge structure with multiple blocks connected by appropriate linkage words. Each block contains a number of given and additional terms that demonstrate meaningful hierarchical relationships within the knowledge structure. A corresponding example, tabletop activity, was added at the end of "remedial therapy" to show the participant's suggestion for an intervention. No cross linkage was found among the blocks on this map, however, indicating that the participant failed to clarify the connections among them.

Figure 1. A map made by a participant and its scoring
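The four counted parameters can be combined into a single map score. The weights below (1 per relationship, 5 per hierarchy level, 10 per cross linkage, 1 per example) are the ones commonly attributed to Novak & Gowin's protocol; this study reports the raw counts, so the weighting here should be treated as an illustrative assumption.

```python
from dataclasses import dataclass

@dataclass
class MapCounts:
    relationships: int  # valid, labeled concept-to-concept links
    levels: int         # depth of the concept hierarchy
    cross_links: int    # valid links across separate map segments
    examples: int       # concrete instances attached to concepts

def novak_gowin_score(c: MapCounts) -> int:
    """Weighted map score; the 1/5/10/1 weights are those commonly
    attributed to Novak & Gowin (1984), not stated in this study."""
    return c.relationships * 1 + c.levels * 5 + c.cross_links * 10 + c.examples * 1

# A hypothetical map like Figure 1: many relationships and levels,
# one example, but no cross linkages between blocks.
print(novak_gowin_score(MapCounts(relationships=18, levels=4, cross_links=0, examples=1)))  # -> 39
```

Note how heavily cross linkages weigh in this scheme: a map such as Figure 1, which lacks them entirely, forfeits the parameter that best distinguishes integrated from departmental structures.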

Content validity

The two case scenarios and their learning objectives were developed by the authors and were subsequently reviewed by three experts in occupational therapy to determine the consistency between the case content and learning objectives. The Pearson coefficient was 0.92, indicating a high degree of correlation between the case scenarios and the PBL learning objectives.
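For readers wishing to reproduce this kind of validity check, a Pearson coefficient over paired expert ratings can be computed directly. The ratings below are hypothetical placeholders, not the reviewers' actual data.

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation for paired ratings."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical ratings of case content vs. learning objectives by one expert pair
content = [4, 5, 3, 4, 5, 4]
objectives = [4, 5, 3, 5, 5, 4]
print(round(pearson_r(content, objectives), 2))
```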

Reliability

Reliability was established from the scores that two trained raters awarded on 29 concept maps drawn by the participants. The ICCs for inter-rater reliability of the concept mapping scores indicated good agreement: 0.99 for concept relationships (95 % CI, 0.95–0.99), 0.96 for hierarchy (95 % CI, 0.84–0.96), 0.92 for levels (95 % CI, 0.72–0.93), and 0.95 for cross linkages (95 % CI, 0.79–0.95) (Table 1).

95 % CI for inter-rater reliability on the scoring parameters

* p  < 0.05; 95 % CI = 95 % Confidence Interval; ICC = Intra-Class Correlation Coefficient
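The paper does not state which ICC form was used. The sketch below computes one common choice for two raters, the two-way consistency ICC often written ICC(3,1), from the ANOVA mean squares.

```python
def icc_consistency(r1, r2):
    """Single-rater, two-way consistency ICC -- often written ICC(3,1) --
    for two raters scoring the same set of maps. This is one common ICC
    variant; the study does not specify which form it used."""
    n, k = len(r1), 2
    rows = list(zip(r1, r2))
    grand = (sum(r1) + sum(r2)) / (n * k)
    row_means = [sum(row) / k for row in rows]
    col_means = [sum(r1) / n, sum(r2) / n]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for row in rows for x in row)
    ms_rows = ss_rows / (n - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# Two raters whose scores differ only by a constant offset agree
# perfectly in the consistency sense:
print(icc_consistency([10, 14, 9, 20], [12, 16, 11, 22]))  # -> 1.0
```

A consistency ICC ignores systematic rater offsets; an absolute-agreement form, ICC(2,1), would penalize them, which is one reason reporting the exact ICC variant matters.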

Analysis based on morphology

Generally speaking, constructing a concept map begins with the definition of the topic that the map is to address. Key concepts related to the knowledge structure of the map are then identified and listed. Those concepts are considered and sorted by their inclusiveness, from general through moderate to specific. Finally, the concept map is built to reveal the intended knowledge structure. Three major categories were identified based on the morphology of the maps.

Isolated mapping was typified by several single concepts linked to the main map without reference to other associated concepts. The concepts were sometimes misplaced in the hierarchy and seemed to float outside of the main map with an arrow in the opposite direction. In addition, the concepts were less inclusive and difficult to accommodate in the knowledge structure. Figure 2 shows an isolated concept map from the LBL group. Note the morphology of the map, with four concepts on the left side connected to the core concept; however, the connecting arrow points in the opposite direction. This revealed that some concepts were isolated and difficult to progressively differentiate from a superordinate concept and then connect to the knowledge structure. According to our analysis (Table 2), 15.4 % of the LBL and 7.7 % of the PBL students' first maps were sorted into this category. In the second drawing, 11.5 % of LBL students, and no PBL students, still produced isolated maps.

Figure 2. An example of isolated concept mapping

Summary of chi square tests for the test of the homogeneity of the three concept mapping categories in the two groups

* p  < 0.05; Isolated = isolated mappings, Departmental = departmental mappings, Integrated = integrated mappings; PBL = problem-based learning group, LBL = lecture-based learning group

Departmental mapping refers to maps with several separated units or micro maps connected by a single arrow to superordinate concepts. A lack of cross-linkages among micro maps was characteristic of this category, in which the relationships among those micro maps could not be identified and a network-like map was not established. Figure  3 shows a departmental concept map composed of four micro maps; however, there are no cross-linkages to connect them. This reveals that acquired knowledge was included in the separate blocks; however, due to the lack of horizontal links, these concepts could not supplement one another or be mutually retrieved when utilized in a problem-solving task. In contrast to the isolated maps that often appeared in the first drawings, departmental maps were more common in the second drawings. For the first maps, 15.4 % from the LBL group and none from the PBL group were sorted as departmental maps. For the second maps, 23.1 % from the LBL group and 15.4 % from the PBL group were in this category (Table  2 ).

Figure 3. An example of departmental concept mapping

Integrated mapping demonstrates a well-integrated concept map in which a superordinate concept appears at the highest level and cross-linkages among various segments of knowledge are established to illustrate how the micro maps relate to one another. Figure 4 shows an integrated concept map with a superordinate core concept, "Occupational Therapy of Cognitive Disability," which then branches into two micro maps, "Assessment" and "Treatment." Note the complexity: the map contains sufficient concept relationships to illustrate the theme, a reasonable hierarchy showing progressive differentiation of the knowledge structure, and cross-linkages between and within the micro maps. Therefore, this map can be identified as a truly integrated concept map. Based on the analysis, 92.3 % of the first and 84.6 % of the second drawings of the PBL group were integrated maps, while 70.4 % of the first and 66.7 % of the second drawings of the LBL group were sorted into this category. The data indicated a trend in both groups: as more concepts were imported between the first and second drawings, the percentage of isolated and integrated maps declined and the percentage of departmental maps increased. A test of homogeneity among the three concept mapping categories in the two groups (Table 2) showed that the two groups differed in terms of the concept mapping categories; that is, the changes that took place were not homogeneous.

Figure 4. An example of integrated concept mapping
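The homogeneity test behind Table 2 can be sketched as follows. The counts below are hypothetical (the paper reports percentages, not raw counts) and are chosen only to show the mechanics of the statistic.

```python
def chi_square_stat(table):
    """Chi-square statistic for a test of homogeneity on an r x c
    table of observed counts (rows = groups, columns = map categories)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (obs - expected) ** 2 / expected
    return stat

# Hypothetical counts (isolated, departmental, integrated) for two groups
# of 26 students each -- not the study's exact data:
observed = [
    [2, 0, 24],  # PBL
    [4, 4, 18],  # LBL
]
stat = chi_square_stat(observed)
# With df = (2 - 1) * (3 - 1) = 2, compare against the 5 % critical value, 5.991.
```

In practice, expected counts this small would call for an exact test; the illustration is only meant to make the observed-versus-expected computation concrete.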

Analysis of the structure of concept mapping

Table 3 shows the means and SDs of the PBL and LBL groups for the four scoring parameters. Independent-samples t-tests showed that the PBL group scored significantly higher than the LBL group on the relationships among the concepts (t(50) = 2.93, p = 0.005; d = 0.81; power = 0.93), hierarchy levels (t(50) = 2.25, p = 0.029; d = 0.61; power = 0.86), and cross linkages (t(50) = 2.30, p = 0.026; d = 0.62; power = 0.87). Although the mean for "examples" was higher in the PBL group, the difference was not statistically significant (t(50) = 1.14, p = 0.26). Cohen's d indicated medium to large effect sizes for the three significant parameters, with power ranging from 0.86 to 0.93.

The t -test summary of concept mapping scoring levels in the PBL and LBL groups (N = 52)

* p  < 0.05; PBL = problem-based learning, LBL = lecture-based learning; M  = mean; SD  = standard deviation

a Post-hoc power analysis
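The comparisons in Table 3 rest on the pooled-variance t statistic and Cohen's d. A minimal sketch, using toy scores rather than the participants' data, is:

```python
from math import sqrt
from statistics import mean, variance

def independent_t_and_d(a, b):
    """Independent-samples t statistic (pooled variance) and Cohen's d
    for two groups of scores."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    diff = mean(a) - mean(b)
    t = diff / sqrt(pooled_var * (1 / na + 1 / nb))
    d = diff / sqrt(pooled_var)
    return t, d

# Toy scores, not the study's data: here t = 1.0 and d is about 0.63,
# a medium effect that a sample this small cannot detect.
t, d = independent_t_and_d([2, 3, 4, 5, 6], [1, 2, 3, 4, 5])
```

With 26 participants per group, as in the study, the degrees of freedom are na + nb - 2 = 50, matching the t(50) values reported above.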

The present study aimed to understand the effects of PBL on students’ knowledge structures as demonstrated by the patterns of and differences among concept maps. The central result of this study was that most of the concept maps in the PBL group eventually exhibited the integrated concept mapping pattern, which is an identifying mark of a high-quality knowledge structure. The scores on the parameters of concept mapping networks were higher in the PBL group than in the LBL group. The results of the analysis indicated that there were important properties of PBL that contributed to the students’ learning with regard to the knowledge structure.

Learning situations trigger motivation and enhance the acquisition of knowledge

Two students in the isolated map category and two in the integrated map category in the PBL group moved into the departmental map category on the second drawing, while one student in the isolated category and one in the integrated category in the LBL group did the same. Although the two groups therefore seemed to perform almost identically, the distribution of the three categories of concept mapping showed that this was not the case. A close look at the chi-square test for homogeneity revealed that PBL improved the students' performance more than LBL did. The findings demonstrated that the PBL property of situational learning, with more cues, helped to create a desire in students to find out more about the topic and to make that information meaningful. As a result, they continuously expanded the limits of their knowledge and incorporated new and relevant concepts into their knowledge structures [13]. In contrast, de-contextualized learning such as LBL may result in the compartmentalization of concepts or propositions. Although these students tried to connect new concepts to the main map, they were unable to develop a map with an integrated knowledge structure, given the few cues and reminders provided during the learning process.

PBL facilitates connecting knowledge during cognitive construction

Although both student groups were engaged in learning over the same period of time, the performance of the PBL group with regard to concept relationships, hierarchy levels, and cross linkages was significantly better than that of the LBL group. This difference could result from the effect of PBL on the activation and elaboration of previously learned knowledge, whereby the students' reasoning skills might be enhanced. Knowledge is the underpinning of operational and thinking skills; thus, there is no skill without knowledge. In a clinical task requiring a large amount of knowledge, and especially integrated knowledge, it has been suggested that concept mapping can help with the visualization of the thinking process [13].

Problem space enhances conceptual differentiation and integration

According to Ausubel's meaningful learning theory, PBL provides a holistic perspective and a valid problem space for students to enhance their knowledge structure through gradual progressive differentiation and integrative reconciliation [31]. Lecture-based teaching often lacks strategies for integrating the knowledge structure and results in isolated or compartmentalized mapping. It is also important to note the function of group dynamics in PBL: it provides opportunities for students to present their own learning experiences and to value their peers' perspectives, all of which could help them to construct their own frameworks [26]. This may also explain why all of the scoring parameters were higher in the PBL group.

In the current study, isolated and departmental maps continued to appear in the learning process in both groups. This may be a typical result stemming from the large influx of concepts in a short learning period in both the problem scenarios and traditional lectures. As medical education emphasizes the connections and relationships between basic science and clinical knowledge, the amount of learning materials handed to the students should be given careful consideration so that the students have enough time to develop an integrated knowledge structure.

This study identified three categories of concept mapping: isolated, departmental, and integrated. It appears that PBL can help students engage in integrated concept mapping and achieve a more integrated knowledge structure. The findings also revealed that the effect of PBL on the acquisition and integration of knowledge was robust. In order to solve problems in PBL, students connect descriptive knowledge with procedural knowledge and create more details and cross linkages in their knowledge structures, which will benefit clinical reasoning in the future. The findings of this study suggest that educators aiming to enhance their students’ knowledge structures should incorporate PBL and concept mapping in the curriculum.

Limitations

Although the results of the study revealed the benefit of using concept mapping to reveal the knowledge structure developed in PBL, the design bears inherent limitations related to the learning materials and the students' learning time. The study did not account for the amount of learning material relative to cognitive load, and it did not explore the long-term effect of concept mapping coupled with PBL. These important issues deserve to be addressed and explored further.

Acknowledgments

We acknowledge the enthusiastic participation of the PBL and LBL course faculty and the students in the study.

Abbreviations

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

CHH supervised the PBL session processing and accession of all the data in the study, took responsibility for the integrity of the data and the accuracy of data analysis, organized the data, and drafted and revised the article. CYL took the lead in creating the study methodology, the conception and design of the study, and the preparation of the article. Both authors contributed to the revision of the article and approved the final manuscript for publication.

Contributor Information

Chia-Hui Hung, Phone: +886-37-728855-7217, Email: beautycathy1121@gmail.com.

Chen-Yung Lin, Email: lcy@ntnu.edu.tw.
