
Review Article | Open access | Published: 11 January 2023

The effectiveness of collaborative problem solving in promoting students’ critical thinking: A meta-analysis based on empirical literature

Enwei Xu (ORCID: orcid.org/0000-0001-6424-8169), Wei Wang & Qingxia Wang

Humanities and Social Sciences Communications, volume 10, Article number: 16 (2023)


Collaborative problem-solving has been widely embraced in classroom instruction of critical thinking, which is regarded both as the core of competency-based curriculum reform and as a key competence for learners in the 21st century. However, the effectiveness of collaborative problem-solving in promoting students' critical thinking remains uncertain. This study presents the major findings of a meta-analysis of 36 empirical studies published in international educational journals during the 21st century, conducted to identify the effectiveness of collaborative problem-solving in promoting students' critical thinking and to determine, based on evidence, whether and to what extent collaborative problem-solving can raise or lower critical thinking. The findings show that (1) collaborative problem-solving is an effective teaching approach for fostering students' critical thinking, with a significant overall effect size (ES = 0.82, z = 12.78, P < 0.01, 95% CI [0.69, 0.95]); (2) with respect to the dimensions of critical thinking, collaborative problem-solving significantly enhances students' attitudinal tendencies (ES = 1.17, z = 7.62, P < 0.01, 95% CI [0.87, 1.47]) but is less effective at improving students' cognitive skills, having only an upper-middle impact (ES = 0.70, z = 11.55, P < 0.01, 95% CI [0.58, 0.82]); and (3) the teaching type (χ² = 7.20, P < 0.05), intervention duration (χ² = 12.18, P < 0.01), subject area (χ² = 13.36, P < 0.05), group size (χ² = 8.77, P < 0.05), and learning scaffold (χ² = 9.03, P < 0.01) all affect critical thinking and can be viewed as important moderating factors in its development. On the basis of these results, recommendations are made for further study and for instruction that better supports students' critical thinking in the context of collaborative problem-solving.


Introduction

Although critical thinking has a long history in research, the concept of critical thinking, which is regarded as an essential competence for learners in the 21st century, has recently attracted more attention from researchers and teaching practitioners (National Research Council, 2012). Critical thinking should be the core of competency-based curriculum reform (Peng and Deng, 2017), because students with critical thinking can not only understand the meaning of knowledge but also effectively solve practical problems in real life even after the knowledge itself is forgotten (Kek and Huijser, 2011). The definition of critical thinking is not universal (Ennis, 1989; Castle, 2009; Niu et al., 2013). In general, critical thinking is defined as a self-aware and self-regulated thought process (Facione, 1990; Niu et al., 2013). It refers to the cognitive skills needed to interpret, analyze, synthesize, reason, and evaluate information, as well as the attitudinal tendency to apply these abilities (Halpern, 2001). The view that critical thinking can be taught and learned through curriculum teaching has been widely supported by many researchers (e.g., Kuncel, 2011; Leng and Lu, 2020), leading to educators' efforts to foster it among students. In the field of teaching practice, there are three types of courses for teaching critical thinking (Ennis, 1989). The first is an independent curriculum, in which critical thinking is taught and cultivated without involving the knowledge of specific disciplines; the second is an integrated curriculum, in which critical thinking is integrated into the teaching of other disciplines as an explicit teaching goal; and the third is a mixed curriculum, in which critical thinking is taught in parallel with the teaching of other disciplines.
Furthermore, numerous measuring tools have been developed by researchers and educators to measure critical thinking in the context of teaching practice. These include standardized measurement tools, such as WGCTA, CCTST, CCTT, and CCTDI, which have been verified by repeated experiments and are considered effective and reliable by international scholars (Facione and Facione, 1992 ). In short, descriptions of critical thinking, including its two dimensions of attitudinal tendency and cognitive skills, different types of teaching courses, and standardized measurement tools provide a complex normative framework for understanding, teaching, and evaluating critical thinking.

Cultivating critical thinking in curriculum teaching can start with a problem, and one of the most popular critical thinking instructional approaches is problem-based learning (Liu et al., 2020). Duch et al. (2001) noted that problem-based learning in group collaboration is progressive active learning, which can improve students' critical thinking and problem-solving skills. Collaborative problem-solving is the organic integration of collaborative learning and problem-based learning; it takes learners as the center of the learning process and uses ill-structured problems in real-world situations as its starting point (Liang et al., 2017). Students learn the knowledge needed to solve problems in a collaborative group, reach a consensus on problems in the field, and form solutions through social methods such as dialogue, interpretation, questioning, debate, negotiation, and reflection, thus promoting the development of learners' domain knowledge and critical thinking (Cindy, 2004; Liang et al., 2017).

Collaborative problem-solving has been widely used in the teaching practice of critical thinking, and several studies have attempted systematic reviews and meta-analyses of the empirical literature on critical thinking from various perspectives. However, little attention has been paid to the impact of collaborative problem-solving on critical thinking. Examining how to implement critical thinking instruction during collaborative problem-solving would be the best way to develop and enhance it, but this issue remains unexplored, which means that many teachers are poorly equipped to teach critical thinking (Leng and Lu, 2020; Niu et al., 2013). For example, Huber (2016) presented meta-analysis findings from 71 publications on gains in critical thinking over various time frames in college, with the aim of determining whether critical thinking is truly teachable. The authors found that learners significantly improve their critical thinking while in college and that critical thinking differs with factors such as teaching strategy, intervention duration, subject area, and teaching type. However, that study did not determine the usefulness of collaborative problem-solving in fostering students' critical thinking, nor did it reveal whether significant variations existed among the different elements. Liu et al. (2020) conducted a meta-analysis of 31 pieces of educational literature to assess the impact of problem-solving on college students' critical thinking. The authors found that problem-solving could promote the development of critical thinking among college students and proposed establishing a reasonable group structure for problem-solving in follow-up studies to improve students' critical thinking.
Additionally, previous empirical studies have reached inconclusive and even contradictory conclusions about whether and to what extent collaborative problem-solving increases or decreases critical thinking levels. As an illustration, Yang et al. (2008) carried out an experiment on integrated curriculum teaching for college students based on a web bulletin board, with the goal of fostering participants' critical thinking in the context of collaborative problem-solving. Their research revealed that, through sharing, debating, examining, and reflecting on various experiences and ideas, collaborative problem-solving can considerably enhance students' critical thinking in real-life problem situations. In contrast, research by Naber and Wyatt (2014) and Sendag and Odabasi (2009), on undergraduate and high school students respectively, found that collaborative problem-solving had a positive impact on learners' interaction and could improve learning interest and motivation but could not significantly improve students' critical thinking compared with traditional classroom teaching.

The above studies show that the evidence on the effectiveness of collaborative problem-solving in promoting students' critical thinking is inconsistent. Therefore, it is essential to conduct a thorough and trustworthy review to determine whether and to what degree collaborative problem-solving can raise or lower critical thinking. Meta-analysis is a quantitative approach for examining data from separate studies that address the same research topic. It characterizes the effectiveness of an intervention by averaging the effect sizes of numerous quantitative studies, reducing the uncertainty of independent research and producing more conclusive findings (Lipsey and Wilson, 2001).

This paper carried out a meta-analysis to examine the effectiveness of collaborative problem-solving in promoting students' critical thinking, in order to contribute to both research and practice. The following research questions were addressed:

What is the overall effect size of collaborative problem-solving in promoting students’ critical thinking and its impact on the two dimensions of critical thinking (i.e., attitudinal tendency and cognitive skills)?

If the effects reported across the included studies' experimental designs are heterogeneous, how do the various moderating variables account for the disparities between study conclusions?

This research followed the strict procedures (e.g., database searching, identification, screening, eligibility checking, merging, duplicate removal, and analysis of included studies) of the meta-analysis approach proposed by Cooper (2010) for examining quantitative data from separate studies focused on the same research topic. The relevant empirical research published in international educational journals in the 21st century was subjected to this meta-analysis using RevMan 5.4. The consistency of the data extracted separately by two researchers was tested using Cohen's kappa coefficient, and a publication bias test and a heterogeneity test were run on the sample data to ascertain the quality of this meta-analysis.
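Cohen's kappa corrects raw inter-coder agreement for the agreement two coders would reach by chance alone. As a minimal illustration (the include/exclude labels below are hypothetical, not the paper's coding data), the coefficient can be computed as:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical labels."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(x == y for x, y in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical screening decisions by two independent researchers.
a = ["include", "include", "exclude", "include", "exclude", "include"]
b = ["include", "include", "exclude", "exclude", "exclude", "include"]
print(round(cohens_kappa(a, b), 3))  # 0.667
```

A kappa of 1 indicates perfect agreement; values above roughly 0.6 are conventionally read as substantial agreement.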

Data sources and search strategies

There were three stages to the data collection process for this meta-analysis, as shown in Fig. 1, which gives the number of articles included and eliminated during the selection process based on the study eligibility criteria.

figure 1

This flowchart shows the number of records identified, included and excluded in the article.

First, the databases used to systematically search for relevant articles were the journal papers of the Web of Science Core Collection and the Chinese Core source journals, as well as the Chinese Social Science Citation Index (CSSCI) source journal papers included in CNKI. These databases were selected because they are credible platforms for scholarly and peer-reviewed information, with advanced search tools and literature relevant to our topic from reliable researchers and experts. The search string with Boolean operators used in the Web of Science was "TS = (((“critical thinking” or “ct” and “pretest” or “posttest”) or (“critical thinking” or “ct” and “control group” or “quasi experiment” or “experiment”)) and (“collaboration” or “collaborative learning” or “CSCL”) and (“problem solving” or “problem-based learning” or “PBL”))". The research area was "Education Educational Research", and the search period was January 1, 2000, to December 30, 2021. A total of 412 papers were obtained. The search string with Boolean operators used in CNKI was "SU = (‘critical thinking’*‘collaboration’ + ‘critical thinking’*‘collaborative learning’ + ‘critical thinking’*‘CSCL’ + ‘critical thinking’*‘problem solving’ + ‘critical thinking’*‘problem-based learning’ + ‘critical thinking’*‘PBL’ + ‘critical thinking’*‘problem oriented’) AND FT = (‘experiment’ + ‘quasi experiment’ + ‘pretest’ + ‘posttest’ + ‘empirical study’)" (translated into Chinese when searching). A total of 56 studies were found for the search period of January 2000 to December 2021. After all duplicates and retractions were eliminated, the references were exported into EndNote, a program for managing bibliographic references. In all, 466 studies remained.

Second, the studies that matched the inclusion and exclusion criteria for the meta-analysis were chosen by two researchers after they had reviewed the abstracts and titles of the gathered articles, yielding a total of 126 studies.

Third, two researchers thoroughly reviewed each included article’s whole text in accordance with the inclusion and exclusion criteria. Meanwhile, a snowball search was performed using the references and citations of the included articles to ensure complete coverage of the articles. Ultimately, 36 articles were kept.

Two researchers worked together to carry out this entire process, reaching a consensus rate of 94.7% after discussion and negotiation to resolve any emerging differences.

Eligibility criteria

Since not all the retrieved studies matched the criteria for this meta-analysis, eligibility criteria for both inclusion and exclusion were developed as follows:

The publication language of the included studies was limited to English and Chinese, and the full text had to be obtainable. Articles not in these languages, and articles not published between 2000 and 2021, were excluded.

The research design of the included studies must be empirical and quantitative studies that can assess the effect of collaborative problem-solving on the development of critical thinking. Articles that could not identify the causal mechanisms by which collaborative problem-solving affects critical thinking, such as review articles and theoretical articles, were excluded.

The research method of the included studies must feature a randomized control experiment or a quasi-experiment, or a natural experiment, which have a higher degree of internal validity with strong experimental designs and can all plausibly provide evidence that critical thinking and collaborative problem-solving are causally related. Articles with non-experimental research methods, such as purely correlational or observational studies, were excluded.

The participants of the included studies were only students in school, including K-12 students and college students. Articles in which the participants were non-school students, such as social workers or adult learners, were excluded.

The research results of the included studies must mention definite signs that may be utilized to gauge critical thinking’s impact (e.g., sample size, mean value, or standard deviation). Articles that lacked specific measurement indicators for critical thinking and could not calculate the effect size were excluded.

Data coding design

In order to perform a meta-analysis, it is necessary to collect the most important information from the articles, codify that information’s properties, and convert descriptive data into quantitative data. Therefore, this study designed a data coding template (see Table 1 ). Ultimately, 16 coding fields were retained.

The designed data-coding template consisted of three pieces of information. Basic information about the papers was included in the descriptive information: the publishing year, author, serial number, and title of the paper.

The variable information for the experimental design had three variables: the independent variable (instruction method), the dependent variable (critical thinking), and the moderating variables (learning stage, teaching type, intervention duration, learning scaffold, group size, measuring tool, and subject area). Depending on the topic of this study, the intervention strategy, as the independent variable, was coded into collaborative and non-collaborative problem-solving. The dependent variable, critical thinking, was coded as cognitive skills and attitudinal tendency. Seven moderating variables were created by grouping and combining the experimental design variables discovered within the 36 studies (see Table 1): learning stages were encoded as higher education, high school, middle school, and primary school or lower; teaching types were encoded as mixed courses, integrated courses, and independent courses; intervention durations were encoded as 0–1 weeks, 1–4 weeks, 4–12 weeks, and more than 12 weeks; group sizes were encoded as 2–3 persons, 4–6 persons, 7–10 persons, and more than 10 persons; learning scaffolds were encoded as teacher-supported, technique-supported, and resource-supported; measuring tools were encoded as standardized measurement tools (e.g., WGCTA, CCTT, CCTST, and CCTDI) and self-adapting measurement tools (e.g., modified or made by researchers); and subject areas were encoded according to the specific subjects used in the 36 included studies.

The data information contained three metrics for measuring critical thinking: sample size, mean value, and standard deviation. It is important to note that studies with different experimental designs frequently adopt different formulas to determine the effect size; this paper used the standardized mean difference (SMD) formula proposed by Morris (2008, p. 369; see Supplementary Table S3).
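Morris (2008) derives effect sizes for pretest-posttest-control designs; one common variant divides the difference in pre-to-post gains between the treatment and control groups by the pooled pretest standard deviation, with a small-sample bias correction. The following is only a sketch with hypothetical numbers; the paper's exact formula is the one reproduced in its Supplementary Table S3:

```python
import math

def morris_smd(m_pre_t, m_post_t, sd_pre_t, n_t,
               m_pre_c, m_post_c, sd_pre_c, n_c):
    """Pretest-posttest-control standardized mean difference (Morris, 2008).

    The difference of gain scores is divided by the pooled pretest SD,
    then scaled by the small-sample bias correction c_p.
    """
    sd_pool = math.sqrt(((n_t - 1) * sd_pre_t ** 2 + (n_c - 1) * sd_pre_c ** 2)
                        / (n_t + n_c - 2))
    c_p = 1 - 3 / (4 * (n_t + n_c - 2) - 1)
    return c_p * ((m_post_t - m_pre_t) - (m_post_c - m_pre_c)) / sd_pool

# Hypothetical experiment: treatment gains 8 points, control gains 2,
# pretest SD of 10 in both groups of 30 students each.
es = morris_smd(50, 58, 10, 30, 50, 52, 10, 30)
print(round(es, 2))  # 0.59
```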

Procedure for extracting and coding data

According to the data coding template (see Table 1), the information from the 36 papers was retrieved by two researchers, who then entered it into Excel (see Supplementary Table S1). If an article contained numerous studies on critical thinking, or if a study assessed different critical thinking dimensions, the results of each were extracted separately during the data extraction procedure. For instance, Tiwari et al. (2010) used four time points to examine critical thinking outcomes, which were viewed as separate studies, and Chen (2013) included the two outcome variables of attitudinal tendency and cognitive skills, which were regarded as two studies. After discussion and negotiation during data extraction, the two researchers' consistency test coefficient was roughly 93.27%. Supplementary Table S2 details the key characteristics of the 36 included articles with 79 effect quantities, including descriptive information (e.g., the publishing year, author, serial number, and title of the paper), variable information (e.g., independent variables, dependent variables, and moderating variables), and data information (e.g., mean values, standard deviations, and sample sizes). Following that, publication bias and heterogeneity tests were performed on the sample data using the RevMan 5.4 software, and the test results were then used to conduct the meta-analysis.

Publication bias test

Publication bias is present when the sample of studies included in a meta-analysis does not accurately reflect the general status of research on the relevant subject, and it can compromise the reliability and accuracy of the meta-analysis. The sample data therefore need to be checked for publication bias (Stewart et al., 2006). A popular check is the funnel plot: publication bias is unlikely when the data are evenly dispersed on either side of the average effect size and concentrated toward the top of the funnel. The data in the funnel plot for this analysis (see Fig. 2) are evenly dispersed within the upper portion of the efficient zone, indicating that publication bias is unlikely in this situation.

figure 2

This funnel plot shows the publication bias test results for the 79 effect quantities across the 36 studies.

Heterogeneity test

The results of a heterogeneity test on the effect sizes determine the appropriate effect model for the meta-analysis. In a meta-analysis, it is common practice to gauge the degree of heterogeneity using the I² value: I² ≥ 50% is typically understood to denote medium-to-high heterogeneity, which calls for a random effect model; otherwise, a fixed effect model ought to be applied (Lipsey and Wilson, 2001). The heterogeneity test in this paper (see Table 2) revealed that I² was 86%, displaying significant heterogeneity (P < 0.01). To ensure accuracy and reliability, the overall effect size was therefore calculated using the random effect model.
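The I² statistic is derived from Cochran's Q, the weighted sum of squared deviations of each effect size from the fixed-effect mean, with weights equal to inverse sampling variances. A minimal sketch with hypothetical effect sizes and variances (not the paper's data):

```python
def i_squared(effects, variances):
    """Cochran's Q and the I^2 heterogeneity statistic."""
    weights = [1 / v for v in variances]
    mean_fixed = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    q = sum(w * (e - mean_fixed) ** 2 for w, e in zip(weights, effects))
    df = len(effects) - 1
    # I^2 is the share of Q exceeding its expectation under homogeneity.
    i2 = max(0.0, (q - df) / q) if q > 0 else 0.0
    return q, i2

# Hypothetical standardized mean differences with sampling variances.
effects = [0.2, 0.5, 0.9, 1.3]
variances = [0.04, 0.05, 0.04, 0.06]
q, i2 = i_squared(effects, variances)
print(round(q, 2), round(i2, 2))
```

With these numbers I² is around 0.79, which by the 50% rule above would call for a random effect model.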

The analysis of the overall effect size

This meta-analysis utilized a random effect model to examine the 79 effect quantities from 36 studies, given the significant heterogeneity. In accordance with Cohen's criterion (Cohen, 1992), the analysis results, shown in the forest plot of the overall effect (see Fig. 3), make clear that the cumulative effect size of collaborative problem-solving is 0.82, which is statistically significant (z = 12.78, P < 0.01, 95% CI [0.69, 0.95]) and indicates that it can encourage learners to practice critical thinking.

figure 3

This forest plot shows the analysis result of the overall effect size across 36 studies.
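A random effect model adds an estimate of between-study variance (tau²) to each study's sampling variance before pooling. As an illustration of the kind of computation behind a pooled effect with its z value and 95% CI (DerSimonian-Laird estimator, hypothetical data; RevMan's internal implementation may differ in detail):

```python
import math

def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects pooled effect, z value, 95% CI."""
    w = [1 / v for v in variances]
    mean_fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - mean_fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)  # between-study variance estimate
    w_star = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return pooled, pooled / se, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical standardized mean differences with sampling variances.
effects = [0.2, 0.5, 0.9, 1.3]
variances = [0.04, 0.05, 0.04, 0.06]
pooled, z, ci = random_effects_pool(effects, variances)
print(round(pooled, 2), round(z, 2))
```

Because tau² inflates every study's variance equally, the random-effects weights are more uniform than fixed-effect weights, so small studies count relatively more and the confidence interval widens.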

In addition, this study examined two distinct dimensions of critical thinking to better understand the precise contributions that collaborative problem-solving makes to the growth of critical thinking. The findings (see Table 3) indicate that collaborative problem-solving improves cognitive skills (ES = 0.70) and attitudinal tendency (ES = 1.17), with significant intergroup differences (χ² = 7.95, P < 0.01). Although collaborative problem-solving improves both dimensions of critical thinking, it is essential to point out that the improvement in students' attitudinal tendency is much more pronounced, with a significant comprehensive effect (ES = 1.17, z = 7.62, P < 0.01, 95% CI [0.87, 1.47]), whereas the gain in learners' cognitive skills is more modest, reaching only an upper-middle level (ES = 0.70, z = 11.55, P < 0.01, 95% CI [0.58, 0.82]).

The analysis of moderator effect size

A two-tailed test of the 79 effect quantities in the whole forest plot revealed significant heterogeneity (I² = 86%, z = 12.78, P < 0.01), indicating differences between effect sizes that may have been influenced by moderating factors other than sampling error. Therefore, subgroup analysis was used to explore the moderating factors that might produce this considerable heterogeneity (the learning stage, learning scaffold, teaching type, group size, intervention duration, measuring tool, and subject area included in the 36 experimental designs), in order to further identify the key factors that influence critical thinking. The findings (see Table 4) indicate that the various moderating factors all have beneficial effects on critical thinking. The subject area (χ² = 13.36, P < 0.05), group size (χ² = 8.77, P < 0.05), intervention duration (χ² = 12.18, P < 0.01), learning scaffold (χ² = 9.03, P < 0.01), and teaching type (χ² = 7.20, P < 0.05) are all significant moderators that can be applied to support the cultivation of critical thinking. However, since the learning stage and the measuring tool did not show significant intergroup differences (χ² = 3.15, P = 0.21 > 0.05, and χ² = 0.08, P = 0.78 > 0.05, respectively), we are unable to explain why these two factors would be crucial in supporting the cultivation of critical thinking in the context of collaborative problem-solving. The specific results are as follows:

Various learning stages influenced critical thinking positively, without significant intergroup differences (χ² = 3.15, P = 0.21 > 0.05). High school ranked first in effect size (ES = 1.36, P < 0.01), followed by higher education (ES = 0.78, P < 0.01) and middle school (ES = 0.73, P < 0.01). These results show that, despite the learning stage's beneficial influence on cultivating learners' critical thinking, we are unable to explain why it is essential for cultivating critical thinking in the context of collaborative problem-solving.

Different teaching types had varying degrees of positive impact on critical thinking, with significant intergroup differences (χ² = 7.20, P < 0.05). The effect sizes were ranked as follows: mixed courses (ES = 1.34, P < 0.01), integrated courses (ES = 0.81, P < 0.01), and independent courses (ES = 0.27, P < 0.01). These results indicate that the most effective way to cultivate critical thinking through collaborative problem-solving is the teaching type of mixed courses.

Various intervention durations significantly improved critical thinking, with significant intergroup differences (χ² = 12.18, P < 0.01). The effect sizes tended to increase with longer intervention durations, and the improvement in critical thinking reached a significant level (ES = 0.85, P < 0.01) after more than 12 weeks of training. These findings indicate that intervention duration and the impact on critical thinking are positively correlated, with longer interventions having greater effects.

Different learning scaffolds influenced critical thinking positively, with significant intergroup differences (χ² = 9.03, P < 0.01). The teacher-supported learning scaffold (ES = 0.92, P < 0.01) displayed a high level of significant impact, while the resource-supported (ES = 0.69, P < 0.01) and technique-supported (ES = 0.63, P < 0.01) learning scaffolds attained medium-to-high levels of impact. These results show that the teacher-supported learning scaffold has the greatest impact on cultivating critical thinking.

Various group sizes influenced critical thinking positively, and the intergroup differences were statistically significant (χ² = 8.77, P < 0.05). The effect on critical thinking showed a general declining trend with increasing group size: the overall effect size for groups of 2–3 people was the largest (ES = 0.99, P < 0.01), and when the group size exceeded 7 people, the improvement in critical thinking fell to a lower-middle level (ES < 0.5, P < 0.01). These results show that the impact on critical thinking is negatively associated with group size; as group size grows, the overall impact shrinks, so smaller groups appear more conducive to cultivating critical thinking.

Various measuring tools influenced critical thinking positively, without significant intergroup differences (χ² = 0.08, P = 0.78 > 0.05). The self-adapting measurement tools obtained an upper-medium level of effect (ES = 0.78), while the overall effect size of the standardized measurement tools was slightly larger and significant (ES = 0.84, P < 0.01). These results show that, despite the beneficial influence of both kinds of measuring tools, we are unable to explain why the measuring tool would be crucial in fostering the growth of critical thinking through collaborative problem-solving.

Different subject areas had varying degrees of impact on critical thinking, and the intergroup differences were statistically significant (χ² = 13.36, P < 0.05). Mathematics had the greatest overall impact, achieving a significant level of effect (ES = 1.68, P < 0.01), followed by science (ES = 1.25, P < 0.01) and medical science (ES = 0.87, P < 0.01), both of which also achieved significant levels of effect. Programming technology was the least effective (ES = 0.39, P < 0.01), having only a medium-low degree of effect compared with education (ES = 0.72, P < 0.01) and other fields such as language, art, and the social sciences (ES = 0.58, P < 0.01). These results suggest that scientific fields (e.g., mathematics, science) may be the most effective subject areas for cultivating critical thinking through collaborative problem-solving.
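The intergroup χ² values reported above come from subgroup heterogeneity tests, in which total heterogeneity is partitioned into within-subgroup and between-subgroup components; the between-subgroup part is compared against a χ² distribution with (number of subgroups − 1) degrees of freedom. A minimal fixed-effect sketch with hypothetical subgroups (not the paper's data):

```python
def q_between(groups):
    """Fixed-effect between-subgroup heterogeneity statistic.

    `groups` maps subgroup name -> list of (effect, variance) pairs.
    Q_between = Q_total - sum of within-subgroup Q values.
    """
    def fixed_q(pairs):
        w = [1 / v for _, v in pairs]
        mean = sum(wi * e for wi, (e, _) in zip(w, pairs)) / sum(w)
        return sum(wi * (e - mean) ** 2 for wi, (e, _) in zip(w, pairs))

    all_pairs = [p for pairs in groups.values() for p in pairs]
    q_within = sum(fixed_q(pairs) for pairs in groups.values())
    return fixed_q(all_pairs) - q_within

# Hypothetical teaching-type subgroups with (effect, variance) pairs.
groups = {
    "mixed": [(1.2, 0.05), (1.4, 0.05)],
    "independent": [(0.3, 0.05), (0.2, 0.05)],
}
qb = q_between(groups)
print(round(qb, 2))
```

Here the two subgroup means differ sharply, so nearly all of the heterogeneity is between subgroups, and Q_between is large relative to its 1 degree of freedom.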

The effectiveness of collaborative problem solving with regard to teaching critical thinking

According to this meta-analysis, using collaborative problem-solving as an intervention strategy in critical thinking teaching has a considerable impact on cultivating learners' critical thinking as a whole and has a favorable promotional effect on both dimensions of critical thinking. Several studies have found that collaborative problem-solving, the most frequently used critical thinking teaching strategy in curriculum instruction, can considerably enhance students' critical thinking (e.g., Liang et al., 2017; Liu et al., 2020; Cindy, 2004). This meta-analysis provides convergent data support for those views. Thus, the findings not only address the first research question regarding the overall effect of collaborative problem-solving on cultivating critical thinking and its impact on the two dimensions of critical thinking (i.e., attitudinal tendency and cognitive skills) but also strengthen our confidence in cultivating critical thinking through a collaborative problem-solving intervention in the context of classroom teaching.

Furthermore, the associated improvements in attitudinal tendency are much stronger, while the corresponding improvements in cognitive skills are only marginally better. According to some studies, cognitive skills differ from attitudinal tendencies in classroom instruction: the cultivation and development of the former, as a key ability, is a process of gradual accumulation, whereas the latter, as an attitude, is affected by the teaching context (e.g., a novel and exciting teaching approach, or challenging and rewarding tasks) (Halpern, 2001; Wei and Hong, 2022). Collaborative problem-solving as a teaching approach is exciting, interesting, rewarding, and challenging: because it takes learners as the focus and examines ill-structured problems in real situations, it can inspire students to fully realize their potential for problem-solving, which significantly improves their attitudinal tendency toward solving problems (Liu et al., 2020). Just as collaborative problem-solving influences attitudinal tendency, attitudinal tendency impacts cognitive skills when attempting to solve a problem (Liu et al., 2020; Zhang et al., 2022), and stronger attitudinal tendencies are associated with improved learning achievement and cognitive ability in students (Sison, 2008; Zhang et al., 2022). Thus, the two specific dimensions of critical thinking, as well as critical thinking as a whole, are affected by collaborative problem-solving, and this study illuminates the nuanced links between cognitive skills and attitudinal tendencies across these two dimensions. To fully develop students' capacity for critical thinking, future empirical research should pay closer attention to cognitive skills.

The moderating effects of collaborative problem solving with regard to teaching critical thinking

To further explore the key factors that influence critical thinking, a subgroup analysis was conducted to probe moderating effects that might produce considerable heterogeneity. The findings show that moderating factors such as teaching type, learning stage, group size, learning scaffold, intervention duration, measuring tool, and subject area across the 36 experimental designs could all support the cultivation of critical thinking through collaborative problem-solving. Among them, the effect size differences for learning stage and measuring tool are not significant, so these two factors cannot be shown to be crucial in supporting the cultivation of critical thinking through collaborative problem-solving.
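The chi² statistics reported for these moderators are between-groups heterogeneity tests: subgroup summary effects are pooled with inverse-variance weights, and the weighted dispersion of subgroup effects around the pooled effect (Cochran’s Q_B) is compared against a chi-square distribution with (number of groups − 1) degrees of freedom. A minimal Python sketch of that computation, using hypothetical subgroup summaries rather than the study’s actual data:

```python
# Sketch of a between-groups heterogeneity test (Cochran's Q_B),
# the chi-square statistic used in the subgroup analyses.
# The subgroup summaries (effect size, standard error) below are hypothetical.

def q_between(groups):
    """Q_B = sum of w_i * (es_i - pooled)^2 over subgroup summary effects,
    with inverse-variance weights w_i = 1 / se_i^2."""
    weights = [1 / se ** 2 for _, se in groups]
    effects = [es for es, _ in groups]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    return sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))

# Two hypothetical subgroups (es, se); df = number of groups - 1 = 1
subgroups = [(0.95, 0.10), (0.60, 0.12)]
q = q_between(subgroups)
# Chi-square critical value for df = 1 at alpha = 0.05 is 3.84
print(f"Q_B = {q:.2f}, significant: {q > 3.84}")  # → Q_B = 5.02, significant: True
```

A significant Q_B (as for teaching type, intervention duration, subject area, group size, and learning scaffold here) indicates that the subgroup effect sizes differ by more than sampling error alone would predict.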

In terms of the learning stage, every stage influenced critical thinking positively, but without significant intergroup differences, so we cannot determine at which stage collaborative problem-solving fosters critical thinking most effectively.

Although higher education accounts for 70.89% of the empirical studies, high school may be the most appropriate learning stage for fostering students’ critical thinking through collaborative problem-solving, since it has the largest overall effect size. This phenomenon may be related to students’ cognitive development and needs to be examined in follow-up research.

With regard to teaching type, mixed course teaching may be the best teaching method to cultivate students’ critical thinking. Relevant studies have shown that, in the actual teaching process, if students are trained in thinking methods alone, the methods they learn are isolated and divorced from subject knowledge, which hinders transfer; conversely, if students’ thinking is trained only within subject teaching, without systematic method training, it is difficult to apply to real-world circumstances (Ruggiero, 2012; Hu and Liu, 2015). Teaching critical thinking as a mixed course in parallel with subject teaching achieves the best effect on learners’ critical thinking, and explicit critical thinking instruction is more effective than less explicit instruction (Bensley and Spero, 2014).

In terms of intervention duration, the overall effect size trends upward with longer interventions; intervention duration and the impact on critical thinking are thus positively correlated. Critical thinking, as a key competency for students in the 21st century, is difficult to improve meaningfully within a brief intervention; instead, it develops over a lengthy period through consistent teaching and the progressive accumulation of knowledge (Halpern, 2001; Hu and Liu, 2015). Future empirical studies should therefore take these constraints into account and teach critical thinking over a longer period.

With regard to group size, groups of 2–3 persons have the highest effect size, and the comprehensive effect size generally decreases as group size increases. This outcome is in line with some research findings; for example, a group of two to four members is most appropriate for collaborative learning (Schellens and Valcke, 2006). However, the meta-analysis also indicates that once group size exceeds 7 people, small groups no longer produce better interaction and performance than large groups. This may be because learning scaffolds (technique support, resource support, and teacher support) improve the frequency and effectiveness of interaction among group members, and a collaborative group with more members may increase the diversity of views, which helps cultivate critical thinking through collaborative problem-solving.

With regard to the learning scaffold, all three kinds of learning scaffolds can enhance critical thinking. Among them, the teacher-supported learning scaffold has the largest overall effect size, demonstrating the interdependence of effective learning scaffolds and collaborative problem-solving. This outcome is in line with earlier findings: encouraging learners to collaborate, generate solutions, and develop critical thinking skills through learning scaffolds is a successful strategy (Reiser, 2004; Xu et al., 2022); learning scaffolds can lower task complexity and unpleasant feelings while enticing students to engage in learning activities (Wood et al., 2006); and learning scaffolds help students apply learning approaches more successfully during collaborative problem-solving, with teacher-supported scaffolds exerting the greatest influence on critical thinking because they are more targeted, informative, and timely (Xu et al., 2022).

With respect to the measuring tool, although standardized instruments (such as the WGCTA, CCTT, and CCTST) are acknowledged as reliable and valid by experts worldwide, only 54.43% of the studies included in this meta-analysis adopted them, and the results indicated no intergroup differences. These results suggest that not all teaching circumstances are suited to measuring critical thinking with standardized tools. As Simpson and Courtney (2002, p. 91) note, “The measuring tools for measuring thinking ability have limits in assessing learners in educational situations and should be adapted appropriately to accurately assess the changes in learners’ critical thinking.” To gauge more fully and precisely how learners’ critical thinking evolves, standardized measuring tools must therefore be properly adapted to collaborative problem-solving learning contexts.

With regard to the subject area, the comprehensive effect size of science subjects (e.g., mathematics, science, and medical science) is larger than that of language arts and the social sciences. Recent international education reforms have noted that critical thinking is a basic part of scientific literacy. Students with scientific literacy can justify their judgments with accurate evidence and reasonable standards when facing challenges or ill-structured problems (Kyndt et al., 2013), which makes critical thinking crucial for developing scientific understanding and applying it to practical problems related to science, technology, and society (Yore et al., 2007).

Suggestions for critical thinking teaching

Other than those stated in the discussion above, the following suggestions are offered for critical thinking instruction utilizing the approach of collaborative problem-solving.

First, teachers should place special emphasis on the two core elements, collaboration and problem-solving, and design real problems based on collaborative situations. This meta-analysis provides evidence that collaborative problem-solving has a strong synergistic effect on promoting students’ critical thinking. Asking questions about real situations and allowing learners to take part in critical discussions of real problems during class instruction are key ways to teach critical thinking, rather than simply reading speculative articles without practice (Mulnix, 2012). Furthermore, students’ critical thinking improves through cognitive conflict with other learners in the problem situation (Yang et al., 2008). Teachers should therefore design real problems and encourage students to discuss, negotiate, and argue within collaborative problem-solving situations.

Second, teachers should design and implement mixed courses to cultivate learners’ critical thinking through collaborative problem-solving. Critical thinking can be taught through curriculum instruction (Kuncel, 2011; Leng and Lu, 2020), with the goal of cultivating critical thinking that transfers flexibly to real problem-solving situations. This meta-analysis shows that mixed course teaching has a highly substantial impact on cultivating and promoting learners’ critical thinking. Teachers should therefore design mixed course teaching that combines real collaborative problem-solving situations with the knowledge content of specific disciplines, teach critical thinking methods and strategies based on ill-structured problems to help students master critical thinking, and provide practical activities in which students interact with one another to develop knowledge construction and critical thinking.

Third, teachers, particularly preservice teachers, should receive more training in critical thinking, and they should also be conscious of how teacher-supported learning scaffolds can promote it. The teacher-supported learning scaffold had the greatest impact on learners’ critical thinking, being more directive, targeted, and timely (Wood et al., 2006). Critical thinking can only be taught effectively when teachers recognize its significance for students’ growth and use appropriate approaches when designing instructional activities (Forawi, 2016). To enable teachers to create learning scaffolds that cultivate critical thinking through collaborative problem-solving, instruction on teaching critical thinking should therefore concentrate on teacher-supported scaffolds, especially for preservice teachers.

Implications and limitations

There are certain limitations in this meta-analysis that future research can address. First, the search languages were restricted to English and Chinese, so pertinent studies written in other languages may have been overlooked, limiting the number of articles reviewed. Second, some data were missing from the included studies, such as whether teachers were trained in the theory and practice of critical thinking, the average age and gender of learners, and differences in critical thinking among learners of various ages and genders. Third, as is typical for review articles, more studies were released while this meta-analysis was being conducted, so its coverage has a time limit. As relevant research develops, future studies focusing on these issues are highly relevant and needed.

Conclusions

This study addressed the question of how effectively collaborative problem-solving promotes students’ critical thinking, a topic that had received scant attention in earlier research. The following conclusions can be made:

Regarding the results obtained, collaborative problem solving is an effective teaching approach to foster learners’ critical thinking, with a significant overall effect size (ES = 0.82, z  = 12.78, P  < 0.01, 95% CI [0.69, 0.95]). With respect to the dimensions of critical thinking, collaborative problem-solving can significantly and effectively improve students’ attitudinal tendency, and the comprehensive effect is significant (ES = 1.17, z  = 7.62, P  < 0.01, 95% CI [0.87, 1.47]); nevertheless, it falls short in terms of improving students’ cognitive skills, having only an upper-middle impact (ES = 0.70, z  = 11.55, P  < 0.01, 95% CI [0.58, 0.82]).

As demonstrated by both the results and the discussion, there are varying degrees of beneficial effects on students’ critical thinking from all seven moderating factors, which were found across 36 studies. In this context, the teaching type (chi 2  = 7.20, P  < 0.05), intervention duration (chi 2  = 12.18, P  < 0.01), subject area (chi 2  = 13.36, P  < 0.05), group size (chi 2  = 8.77, P  < 0.05), and learning scaffold (chi 2  = 9.03, P  < 0.01) all have a positive impact on critical thinking, and they can be viewed as important moderating factors that affect how critical thinking develops. Since the learning stage (chi 2  = 3.15, P  = 0.21 > 0.05) and measuring tools (chi 2  = 0.08, P  = 0.78 > 0.05) did not demonstrate any significant intergroup differences, we are unable to explain why these two factors are crucial in supporting the cultivation of critical thinking in the context of collaborative problem-solving.

Data availability

All data generated or analyzed during this study are included within the article and its supplementary information files, and the supplementary information files are available in the Dataverse repository: https://doi.org/10.7910/DVN/IPFJO6 .

Bensley DA, Spero RA (2014) Improving critical thinking skills and meta-cognitive monitoring through direct infusion. Think Skills Creat 12:55–68. https://doi.org/10.1016/j.tsc.2014.02.001


Castle A (2009) Defining and assessing critical thinking skills for student radiographers. Radiography 15(1):70–76. https://doi.org/10.1016/j.radi.2007.10.007

Chen XD (2013) An empirical study on the influence of PBL teaching model on critical thinking ability of non-English majors. J PLA Foreign Lang College 36 (04):68–72


Cohen A (1992) Antecedents of organizational commitment across occupational groups: a meta-analysis. J Organ Behav. https://doi.org/10.1002/job.4030130602

Cooper H (2010) Research synthesis and meta-analysis: a step-by-step approach, 4th edn. Sage, London, England

Cindy HS (2004) Problem-based learning: what and how do students learn? Educ Psychol Rev 51(1):31–39

Duch BJ, Gron SD, Allen DE (2001) The power of problem-based learning: a practical “how to” for teaching undergraduate courses in any discipline. Stylus Educ Sci 2:190–198

Ennis RH (1989) Critical thinking and subject specificity: clarification and needed research. Educ Res 18(3):4–10. https://doi.org/10.3102/0013189x018003004

Facione PA (1990) Critical thinking: a statement of expert consensus for purposes of educational assessment and instruction. Research findings and recommendations. Eric document reproduction service. https://eric.ed.gov/?id=ed315423

Facione PA, Facione NC (1992) The California Critical Thinking Dispositions Inventory (CCTDI) and the CCTDI test manual. California Academic Press, Millbrae, CA

Forawi SA (2016) Standard-based science education and critical thinking. Think Skills Creat 20:52–62. https://doi.org/10.1016/j.tsc.2016.02.005

Halpern DF (2001) Assessing the effectiveness of critical thinking instruction. J Gen Educ 50(4):270–286. https://doi.org/10.2307/27797889

Hu WP, Liu J (2015) Cultivation of pupils’ thinking ability: a five-year follow-up study. Psychol Behav Res 13(05):648–654. https://doi.org/10.3969/j.issn.1672-0628.2015.05.010

Huber K (2016) Does college teach critical thinking? A meta-analysis. Rev Educ Res 86(2):431–468. https://doi.org/10.3102/0034654315605917

Kek MYCA, Huijser H (2011) The power of problem-based learning in developing critical thinking skills: preparing students for tomorrow’s digital futures in today’s classrooms. High Educ Res Dev 30(3):329–341. https://doi.org/10.1080/07294360.2010.501074

Kuncel NR (2011) Measurement and meaning of critical thinking (Research report for the NRC 21st Century Skills Workshop). National Research Council, Washington, DC

Kyndt E, Raes E, Lismont B, Timmers F, Cascallar E, Dochy F (2013) A meta-analysis of the effects of face-to-face cooperative learning. Do recent studies falsify or verify earlier findings? Educ Res Rev 10(2):133–149. https://doi.org/10.1016/j.edurev.2013.02.002

Leng J, Lu XX (2020) Is critical thinking really teachable?—A meta-analysis based on 79 experimental or quasi experimental studies. Open Educ Res 26(06):110–118. https://doi.org/10.13966/j.cnki.kfjyyj.2020.06.011

Liang YZ, Zhu K, Zhao CL (2017) An empirical study on the depth of interaction promoted by collaborative problem solving learning activities. J E-educ Res 38(10):87–92. https://doi.org/10.13811/j.cnki.eer.2017.10.014

Lipsey M, Wilson D (2001) Practical meta-analysis. International Educational and Professional, London, pp. 92–160

Liu Z, Wu W, Jiang Q (2020) A study on the influence of problem based learning on college students’ critical thinking-based on a meta-analysis of 31 studies. Explor High Educ 03:43–49

Morris SB (2008) Estimating effect sizes from pretest-posttest-control group designs. Organ Res Methods 11(2):364–386. https://doi.org/10.1177/1094428106291059


Mulnix JW (2012) Thinking critically about critical thinking. Educ Philos Theory 44(5):464–479. https://doi.org/10.1111/j.1469-5812.2010.00673.x

Naber J, Wyatt TH (2014) The effect of reflective writing interventions on the critical thinking skills and dispositions of baccalaureate nursing students. Nurse Educ Today 34(1):67–72. https://doi.org/10.1016/j.nedt.2013.04.002

National Research Council (2012) Education for life and work: developing transferable knowledge and skills in the 21st century. The National Academies Press, Washington, DC

Niu L, Behar HLS, Garvan CW (2013) Do instructional interventions influence college students’ critical thinking skills? A meta-analysis. Educ Res Rev 9(12):114–128. https://doi.org/10.1016/j.edurev.2012.12.002

Peng ZM, Deng L (2017) Towards the core of education reform: cultivating critical thinking skills as the core of skills in the 21st century. Res Educ Dev 24:57–63. https://doi.org/10.14121/j.cnki.1008-3855.2017.24.011

Reiser BJ (2004) Scaffolding complex learning: the mechanisms of structuring and problematizing student work. J Learn Sci 13(3):273–304. https://doi.org/10.1207/s15327809jls1303_2

Ruggiero VR (2012) The art of thinking: a guide to critical and creative thought, 4th edn. Harper Collins College Publishers, New York

Schellens T, Valcke M (2006) Fostering knowledge construction in university students through asynchronous discussion groups. Comput Educ 46(4):349–370. https://doi.org/10.1016/j.compedu.2004.07.010

Sendag S, Odabasi HF (2009) Effects of an online problem based learning course on content knowledge acquisition and critical thinking skills. Comput Educ 53(1):132–141. https://doi.org/10.1016/j.compedu.2009.01.008

Sison R (2008) Investigating Pair Programming in a Software Engineering Course in an Asian Setting. 2008 15th Asia-Pacific Software Engineering Conference, pp. 325–331. https://doi.org/10.1109/APSEC.2008.61

Simpson E, Courtney M (2002) Critical thinking in nursing education: literature review. Int J Nurs Pract 8(2):89–98

Stewart L, Tierney J, Burdett S (2006) Do systematic reviews based on individual patient data offer a means of circumventing biases associated with trial publications? Publication bias in meta-analysis. John Wiley and Sons Inc, New York, pp. 261–286

Tiwari A, Lai P, So M, Yuen K (2010) A comparison of the effects of problem-based learning and lecturing on the development of students’ critical thinking. Med Educ 40(6):547–554. https://doi.org/10.1111/j.1365-2929.2006.02481.x

Wood D, Bruner JS, Ross G (2006) The role of tutoring in problem solving. J Child Psychol Psychiatry 17(2):89–100. https://doi.org/10.1111/j.1469-7610.1976.tb00381.x

Wei T, Hong S (2022) The meaning and realization of teachable critical thinking. Educ Theory Practice 10:51–57

Xu EW, Wang W, Wang QX (2022) A meta-analysis of the effectiveness of programming teaching in promoting K-12 students’ computational thinking. Educ Inf Technol. https://doi.org/10.1007/s10639-022-11445-2

Yang YC, Newby T, Bill R (2008) Facilitating interactions through structured web-based bulletin boards: a quasi-experimental study on promoting learners’ critical thinking skills. Comput Educ 50(4):1572–1585. https://doi.org/10.1016/j.compedu.2007.04.006

Yore LD, Pimm D, Tuan HL (2007) The literacy component of mathematical and scientific literacy. Int J Sci Math Educ 5(4):559–589. https://doi.org/10.1007/s10763-007-9089-4

Zhang T, Zhang S, Gao QQ, Wang JH (2022) Research on the development of learners’ critical thinking in online peer review. Audio Visual Educ Res 6:53–60. https://doi.org/10.13811/j.cnki.eer.2022.06.08


Acknowledgements

This research was supported by the graduate scientific research and innovation project of Xinjiang Uygur Autonomous Region named “Research on in-depth learning of high school information technology courses for the cultivation of computing thinking” (No. XJ2022G190) and the independent innovation fund project for doctoral students of the College of Educational Science of Xinjiang Normal University named “Research on project-based teaching of high school information technology courses from the perspective of discipline core literacy” (No. XJNUJKYA2003).

Author information

Authors and Affiliations

College of Educational Science, Xinjiang Normal University, 830017, Urumqi, Xinjiang, China

Enwei Xu, Wei Wang & Qingxia Wang


Corresponding authors

Correspondence to Enwei Xu or Wei Wang .

Ethics declarations

Competing interests

The authors declare no competing interests.

Ethical approval

This article does not contain any studies with human participants performed by any of the authors.

Informed consent

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Supplementary Tables

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article.

Xu, E., Wang, W. & Wang, Q. The effectiveness of collaborative problem solving in promoting students’ critical thinking: A meta-analysis based on empirical literature. Humanit Soc Sci Commun 10 , 16 (2023). https://doi.org/10.1057/s41599-023-01508-1


Received : 07 August 2022

Accepted : 04 January 2023

Published : 11 January 2023




Quick links

  • Explore articles by subject
  • Guide to authors
  • Editorial policies

collaborative problem solving de

collaborative problem solving de

Collaborative Problem Solving: What It Is and How to Do It

What is collaborative problem solving, how to solve problems as a team, celebrating success as a team.

Problems arise. That's a well-known fact of life and business. When they do, it may seem more straightforward to take individual ownership of the problem and immediately run with trying to solve it. However, the most effective problem-solving solutions often come through collaborative problem solving.

As defined by Webster's Dictionary , the word collaborate is to work jointly with others or together, especially in an intellectual endeavor. Therefore, collaborative problem solving (CPS) is essentially solving problems by working together as a team. While problems can and are solved individually, CPS often brings about the best resolution to a problem while also developing a team atmosphere and encouraging creative thinking.

Because collaborative problem solving involves multiple people and ideas, there are some techniques that can help you stay on track, engage efficiently, and communicate effectively during collaboration.

  • Set Expectations. From the very beginning, expectations for openness and respect must be established for CPS to be effective. Everyone participating should feel that their ideas will be heard and valued.
  • Provide Variety. Another way of providing variety can be by eliciting individuals outside the organization but affected by the problem. This may mean involving various levels of leadership from the ground floor to the top of the organization. It may be that you involve someone from bookkeeping in a marketing problem-solving session. A perspective from someone not involved in the day-to-day of the problem can often provide valuable insight.
  • Communicate Clearly.  If the problem is not well-defined, the solution can't be. By clearly defining the problem, the framework for collaborative problem solving is narrowed and more effective.
  • Expand the Possibilities.  Think beyond what is offered. Take a discarded idea and expand upon it. Turn it upside down and inside out. What is good about it? What needs improvement? Sometimes the best ideas are those that have been discarded rather than reworked.
  • Encourage Creativity.  Out-of-the-box thinking is one of the great benefits of collaborative problem-solving. This may mean that solutions are proposed that have no way of working, but a small nugget makes its way from that creative thought to evolution into the perfect solution.
  • Provide Positive Feedback. There are many reasons participants may hold back in a collaborative problem-solving meeting. Fear of performance evaluation, lack of confidence, lack of clarity, and hierarchy concerns are just a few of the reasons people may not initially participate in a meeting. Positive public feedback early on in the meeting will eliminate some of these concerns and create more participation and more possible solutions.
  • Consider Solutions. Once several possible ideas have been identified, discuss the advantages and drawbacks of each one until a consensus is made.
  • Assign Tasks.  A problem identified and a solution selected is not a problem solved. Once a solution is determined, assign tasks to work towards a resolution. A team that has been invested in the creation of the solution will be invested in its resolution. The best time to act is now.
  • Evaluate the Solution. Reconnect as a team once the solution is implemented and the problem is solved. What went well? What didn't? Why? Collaboration doesn't necessarily end when the problem is solved. The solution to the problem is often the next step towards a new collaboration.

The burden that is lifted when a problem is solved is enough victory for some. However, a team that plays together should celebrate together. It's not only collaboration that brings unity to a team. It's also the combined celebration of a unified victory—the moment you look around and realize the collectiveness of your success.

We can help

Check out MindManager to learn more about how you can ignite teamwork and innovation by providing a clearer perspective on the big picture with a suite of sharing options and collaborative tools.

Need to Download MindManager?

Try the full version of mindmanager free for 30 days.

How to ace collaborative problem solving

April 30, 2023 They say two heads are better than one, but is that true when it comes to solving problems in the workplace? To solve any problem—whether personal (eg, deciding where to live), business-related (eg, raising product prices), or societal (eg, reversing the obesity epidemic)—it’s crucial to first define the problem. In a team setting, that translates to establishing a collective understanding of the problem, awareness of context, and alignment of stakeholders. “Both good strategy and good problem solving involve getting clarity about the problem at hand, being able to disaggregate it in some way, and setting priorities,” Rob McLean, McKinsey director emeritus, told McKinsey senior partner Chris Bradley  in an Inside the Strategy Room podcast episode . Check out these insights to uncover how your team can come up with the best solutions for the most complex challenges by adopting a methodical and collaborative approach. 

Want better strategies? Become a bulletproof problem solver

How to master the seven-step problem-solving process

Countering otherness: Fostering integration within teams

Psychological safety and the critical role of leadership development

If we’re all so busy, why isn’t anything getting done?

To weather a crisis, build a network of teams

Unleash your team’s full potential

Modern marketing: Six capabilities for multidisciplinary teams

Beyond collaboration overload

MORE FROM MCKINSEY

Take a step Forward

Jump to content

Home

Psychiatry Academy

Bookmark/search this post.

Facebook logo

You are here

Parenting, teaching and treating challenging kids: the collaborative problem solving approach.

collaborative problem solving de

  • CE Information
  • Register/Take course

Think:Kids and the Department of Psychiatry at Massachusetts General Hospital are pleased to offer an online training program featuring Dr. J. Stuart Ablon. This introductory training provides a foundation for professionals and parents interested in learning the evidence-based approach to understanding and helping children and adolescents with behavioral challenges called Collaborative Problem Solving (CPS). This online training serves as the prerequisite for our professional intensive training.

The CPS approach provides a way of understanding and helping kids who struggle with behavioral challenges. Challenging behavior is thought of as willful and goal oriented which has led to approaches that focus on motivating better behavior using reward and punishment programs. If you’ve tried these strategies and they haven’t worked, this online training is for you! At Think:Kids we have some very different ideas about why these kids struggle. Research over the past 30 years demonstrates that for the majority of these kids, their challenges result from a lack of crucial thinking skills when it comes to things like problem solving, frustration tolerance and flexibility. The CPS approach, therefore, focuses on helping adults teach the skills these children lack while resolving the chronic problems that tend to precipitate challenging behavior.

This training is designed to allow you to learn at your own pace. You must complete the modules sequentially, but you can take your time with the content as your schedule allows. Additional resources for each module provide you with the opportunity for further development. Discussion boards for each module allow you to discuss concepts and your own experiences with other participants. Faculty from the Think:Kids program monitor the boards and offer their point of view.

Registrants will have access to course materials from the date of their registration through the course expiration date.

All Care Providers: $149. Due to COVID-19, we are offering this course at the reduced rate of $99 for a limited time.

NOTE: If you are paying for your registration via Purchase Order, please send the PO to [email protected] . Our customer service agent will respond with further instructions.

Cancellation Policy

Refunds will be issued for requests received within 10 business days of purchase, but an administrative fee of $35 will be deducted from your refund. No refunds will be made thereafter. Additionally, no refunds will be made for individuals who claim CME or credit, regardless of when they request a refund.

Through the duration of the course, the faculty moderator will respond to any clinical questions that are submitted to the interactive discussion board. The faculty moderator for this course will be:

J. Stuart Ablon, PhD

*** Please note that discussion boards are reviewed on a regular basis, and responses to all questions will be posted within one week of receipt. ***

Target Audience

This program is intended for: Parents, clinicians, educators, allied mental health professionals, and direct care staff.

Learning Objectives

At the end of this program, participants will be able to:

  • Shift thinking and approach to foster positive relationships with children
  • Reduce challenging behavior
  • Foster proactive, rather than reactive interventions
  • Teach skills related to self-regulation, communication and problem solving

MaMHCA, and its agent, MMCEP has been designated by the Board of Allied Mental Health and Human Service Professions to approve sponsors of continuing education for licensed mental health counselors in the Commonwealth of Massachusetts for licensure renewal, in accordance with the requirements of 262 CMR 3.00.

This program has been approved for 3.00 CE credit for Licensed Mental Health Counselors MaMHCA.

Authorization number: 17-0490

The Collaborative of NASW, Boston College, and Simmons College Schools of Social Work authorizes social work continuing education credits for courses, workshops, and educational programs that meet the criteria outlined in 258 CMR of the Massachusetts Board of Registration of Social Workers

This program has been approved for 3.00 Social Work Continuing Education hours for relicensure, in accordance with 258 CMR. Collaborative of NASW and the Boston College and Simmons Schools of Social Work Authorization Number D 61675-E

This course allows other providers to claim a Participation Certificate upon successful completion of this course.

Participation Certificates will specify the title, location, type of activity, date of activity, and number of AMA PRA Category 1 Credit™ associated with the activity. Providers should check with their regulatory agencies to determine ways in which AMA PRA Category 1 Credit™ may or may not fulfill continuing education requirements. Providers should also consider saving copies of brochures, agenda, and other supporting documents.

The Massachusetts General Hospital Department of Psychiatry is approved by the American Psychological Association to sponsor continuing education for psychologists. The Massachusetts General Hospital Department of Psychiatry maintains responsibility for this program and its content.

This offering meets the criteria for 3.00 Continuing Education (CE) credits per presentation for psychologists.



Collaborative Problem Solving in Schools

Collaborative Problem Solving ® (CPS) is an evidence-based, trauma-informed practice that helps students meet expectations, reduces concerning behavior, builds students’ skills, and strengthens their relationships with educators.

Collaborative Problem Solving is designed to meet the needs of all children, including those with social, emotional, and behavioral challenges. It promotes the understanding that students who have trouble meeting expectations or managing their behavior lack the skill—not the will—to do so. These students struggle with skills related to problem-solving, flexibility, and frustration tolerance. Collaborative Problem Solving has been shown to help build these skills.

Collaborative Problem Solving avoids using power, control, and motivational procedures. Instead, it focuses on collaborating with students to solve the problems that lead to unmet expectations and concerning behavior. This trauma-informed approach provides staff with actionable strategies for trauma-sensitive education and aims to mitigate implicit bias’s impact on school discipline. It integrates with MTSS frameworks, PBIS, restorative practices, and SEL approaches, such as RULER. Collaborative Problem Solving reduces challenging behavior and teacher stress while building future-ready skills and relationships between educators and students.

Transform School Discipline

Traditional school discipline is broken: it doesn’t result in improved behavior or improved relationships between educators and students. In addition, it has been shown to be disproportionately applied to students of color. The Collaborative Problem Solving approach is an equitable and effective form of relational discipline that reduces concerning behavior and teacher stress while building skills and relationships between educators and students.


Collaborative Problem Solving and SEL

Collaborative Problem Solving aligns with CASEL’s five core competencies by building relationships between teachers and students using everyday situations. Students develop the skills they need to prepare for the real world, including problem-solving, collaboration and communication, flexibility, perspective-taking, and empathy. Collaborative Problem Solving makes social-emotional learning actionable.

Collaborative Problem Solving and MTSS

The Collaborative Problem Solving approach integrates with Multi-Tiered Systems of Support (MTSS) in educational settings. CPS benefits all students and can be implemented across the three tiers of support within an MTSS framework to effectively identify and meet the diverse social, emotional, and behavioral needs of students in schools.


The Results

Our research has shown that the Collaborative Problem Solving approach helps kids and adults build crucial social-emotional skills and leads to dramatic decreases in behavior problems across various settings. Results in schools include remarkable reductions in time spent out of class, detentions, suspensions, injuries, teacher stress, and alternative placements as well as increases in emotional safety, attendance, academic growth, and family participation.


Educators, join us in this introductory course and develop your behavioral growth mindset!

This 2-hour, self-paced course introduces the principles of Collaborative Problem Solving® while outlining how the approach is uniquely suited to the needs of today's educators and students. Tuition: $39.

Bring CPS to Your School

We can help you bring a more accurate, compassionate, and effective approach to working with children to your school or district.


Collaborative problem solvers are made not born – here’s what you need to know


Professor of Cognitive Sciences, University of Central Florida

Disclosure statement

Stephen M. Fiore has received funding from federal agencies such as NASA, ONR, DARPA, and the NSF to study collaborative problem solving and teamwork. He is past president of the Interdisciplinary Network for Group Research, currently a board member of the International Network for the Science of Team Science, and a member of DARPA's Information Science and Technology working group.


Challenges are a fact of life. Whether it’s a high-tech company figuring out how to shrink its carbon footprint, or a local community trying to identify new revenue sources, people are continually dealing with problems that require input from others. In the modern world, we face problems that are broad in scope and great in scale of impact – think of trying to understand and identify potential solutions related to climate change, cybersecurity or authoritarian leaders.

But people usually aren’t born competent in collaborative problem-solving. In fact, a famous turn of phrase about teams is that a team of experts does not make an expert team. Just as troubling, the evidence suggests that, for the most part, people aren’t being taught this skill either. A 2012 survey by the American Management Association found that higher-level managers believed recent college graduates lack collaboration abilities.

Maybe even worse, college grads seem to overestimate their own competence. One 2015 survey found nearly two-thirds of recent graduates believed they can effectively work in a team, but only one-third of managers agreed. The tragic irony is that the less competent you are, the less accurate your self-assessment of your own competence is. It seems that this infamous Dunning-Kruger effect can also occur for teamwork.

Perhaps it’s no surprise that in a 2015 international assessment of hundreds of thousands of students, less than 10% performed at the highest level of collaboration. For example, the vast majority of students could not overcome teamwork obstacles or resolve conflict. They were not able to monitor group dynamics or to engage in the kind of actions needed to make sure the team interacted according to their roles. Given that all these students have had group learning opportunities in and out of school over many years, this points to a global deficit in the acquisition of collaboration skills.

How can this deficiency be addressed? What makes one team effective while another fails? How can educators improve training and testing of collaborative problem-solving? Drawing from disciplines that study cognition, collaboration and learning, my colleagues and I have been studying teamwork processes. Based on this research, we have three key recommendations.


How it should work

At the most general level, collaborative problem-solving requires team members to establish and maintain a shared understanding of the situation they’re facing and any relevant problem elements they’ve identified. At the start, there’s typically an uneven distribution of knowledge on a team. Members must maintain communication to help each other know who knows what, as well as help each other interpret elements of the problem and which expertise should be applied.

Then the team can get to work, laying out subtasks based upon member roles, or creating mechanisms to coordinate member actions. They’ll critique possible solutions to identify the most appropriate path forward.

Finally, at a higher level, collaborative problem-solving requires keeping the team organized – for example, by monitoring interactions and providing feedback to each other. Team members need, at least, basic interpersonal competencies that help them manage relationships within the team (like encouraging participation) and communication (like listening to learn). Even better is the more sophisticated ability to take others’ perspectives, in order to consider alternative views of problem elements.

Whether it is a team of professionals in an organization or a team of scientists solving complex scientific problems, communicating clearly, managing conflict, understanding roles on a team, and knowing who knows what – all are collaboration skills related to effective teamwork.

What’s going wrong in the classroom?

When so many students are continually engaged in group projects, or collaborative learning, why are they not learning about teamwork? There are interrelated factors that may be creating graduates who collaborate poorly but who think they are quite good at teamwork.

I suggest students vastly overestimate their collaboration skills due to the dangerous combination of a lack of systematic instruction coupled with inadequate feedback. On the one hand, students engage in a great deal of group work in high school and college. On the other hand, students rarely receive meaningful instruction, modeling and feedback on collaboration. Decades of research on learning show that explicit instruction and feedback are crucial for mastery.

Although classes that implement collaborative problem-solving do provide some instruction and feedback, it’s not necessarily about their teamwork. Students are learning about concepts in classes; they are acquiring knowledge about a domain. What is missing is something that forces them to explicitly reflect on their ability to work with others.

When students process feedback on how well they learned something, or whether they solved a problem, they mistakenly think this is also indicative of effective teamwork. I hypothesize that students come to conflate learning course content material in any group context with collaboration competency.


A prescription for better collaborators

Now that we’ve defined the problem, what can be done? A century of research on team training, combined with decades of research on group learning in the classroom, points the way forward. My colleagues and I have distilled some core elements from this literature to suggest improvements for collaborative learning.

First, most pressing is to get training on teamwork into the world’s classrooms. At a minimum, this needs to happen during college undergraduate education, but even better would be starting in high school or earlier. Research has demonstrated it’s possible to teach collaboration competencies such as dealing with conflict and communicating to learn. Researchers and educators need, themselves, to collaborate to adapt these methods for the classroom.

Second, students need opportunities for practice. Although most already have experience working in groups, this needs to move beyond science and engineering classes. Students need to learn to work across disciplines so after graduation they can work across professions on solving complex societal problems.

Third, any systematic instruction and practice setting needs to include feedback. This is not simply feedback on whether they solved the problem or did well on learning course content. Rather, it needs to be feedback on interpersonal competencies that drive successful collaboration. Instructors should assess students on teamwork processes like relationship management, where they encourage participation from each other, as well as skills in communication where they actively listen to their teammates.

Even better would be feedback telling students how well they were able to take on the perspective of a teammate from another discipline. For example, was the engineering student able to take the view of a student in law and understand the legal ramifications of a new technology’s implementation?

My colleagues and I believe that explicit instruction on how to collaborate, opportunities for practice, and feedback about collaboration processes will better prepare today’s students to work together to solve tomorrow’s problems.



Collaborative and Proactive Solutions


“Kids do well if they can.”

—Ross Greene, Ph.D.

Say goodbye to conflict, screaming, spankings, detentions, suspensions, de-escalating, restraint, and seclusion. Say hello to solving problems collaboratively and proactively.

What Is Collaborative & Proactive Solutions?

Collaborative & Proactive Solutions (CPS) is an evidence-based model of psychosocial treatment originated and developed by Dr. Ross Greene , and described in his books The Explosive Child , Lost at School , Raising Human Beings , and Lost & Found .


What does CPS do?

Rather than focusing on kids’ concerning behaviors (and modifying them), CPS helps kids and caregivers solve the problems that are causing those behaviors. The problem solving is collaborative (not unilateral) and proactive (not reactive). Research has shown that the model is effective not only at solving problems and improving behavior but also at enhancing skills.

Where has CPS been implemented?

In countless families, general and special education schools, group homes, inpatient psychiatry units, and residential and juvenile detention facilities, the CPS model has been shown to dramatically reduce concerning behavior and dramatically reduce or eliminate discipline referrals, detentions, suspensions, and the use of restraint and seclusion .


How do you get the ball rolling?

This website connects you to a vast array of resources, including a variety of learning and training options and over 200 providers in 16 different countries. And you’ll find lots of additional resources—including the research supporting the effectiveness of the model—on the website of the non-profit, Lives in the Balance.

We are also happy to discuss your specific needs… CONTACT US


Frontiers in Psychology

Collaborative Problem Solving: Processing Actions, Time, and Performance

Paul De Boeck

1 Department of Psychology, The Ohio State University, Columbus, OH, United States

2 Department of Psychology, KU Leuven, Leuven, Belgium

Kathleen Scalise

3 Department of Educational Methodology, Policy, and Leadership, University of Oregon, Eugene, OR, United States

Abstract

This study is based on one collaborative problem solving task from an international assessment: the Xandar task. It was developed and delivered by the Organization for Economic Co-operation and Development Program for International Student Assessment (OECD PISA) in 2015. We have investigated the relationship of problem solving performance with invested time and number of actions in collaborative episodes for the four parts of the Xandar task. The parts require the respondent to collaboratively plan a process for problem solving, implement the process, reach a solution, and evaluate the solution. (For a full description, see the Materials and Methods section, “Parts of the Xandar Task.”) Examples of an action include posting to a chat log, accessing a shared resource, or conducting a search on a map tool. Actions taken in each part of the task were identified by PISA and recorded in the data set numerically. A confirmatory factor analysis (CFA) model looks at two types of relationship: relationships at the level of latent variables (the factors) and extra dependencies, which here are direct effects and correlated residuals (independent of the factors). The model, which fits well, has three latent variables: actions (A), times (T), and level of performance (P). Evidence for the uni-dimensionality of performance level is also found in a separate analysis of the binary items. Overall, for the entire task, participants with more actions are less successful and faster, based on the United States data set employed in the analysis. By contrast, successful participants take more time. By task part, the model also investigates relationships between actions, time, and performance level within the parts, because one can expect dependencies within parts of such a complex task. Results indicate some general and some specific relationships within the parts; see the full manuscript for more detail.
We conclude with a discussion of what the investigated relationships may reveal. We also describe why such investigations may be important to consider when preparing students for improved skills in collaborative problem solving, considered a key aspect of successful 21st century skills in the workplace and in everyday life in many countries.

Introduction

The construct explored here, collaborative problem solving (CPS), was first introduced to the Program for International Student Assessment (PISA) in 2015. Attempts to explore process data collected in complex activities such as CPS are emerging rapidly in education. Yet it is not well understood at this time which models best fit process data or which analytic techniques to employ to investigate patterns in the data. So here we investigate whether relationships seen in the actions taken by PISA respondents, as coded by PISA, might shed light on approaches for modeling complex CPS tasks.

In the CPS task released by PISA, the Xandar task, there are four parts. The parts of the task require the respondent to collaborate to plan a process for problem solving, implement the process, reach a solution, and evaluate the solution. (For a full description of these parts, see the Materials and Methods section, “Parts of the Xandar Task.”) Examples of actions in Part 1, for instance, include posting to a chat log, accessing a shared resource, or conducting a search on a shared map tool.

In each of the parts, process data are available on time spent and number of actions, as well as on the performance on specific items within the four parts. We explore modeling these Xandar data to address three research questions:

  • RQ1. Does a factor model employing process data (actions and time) support evidence for a latent variable differentiation between the two types of process data (actions, time) and between these and the quality of performance? The expected latent variables are Actions, Time, and Performance.
  • RQ2. Do extra dependencies at the level of the observed variables improve model fit, including direct effects and correlated residuals (independent of the factors)? If they do, they reveal direct relationships between process aspects and performance, independent of the latent variables. These direct relationships are indications of the dynamics underlying collaborative problem solving, whereas the latent variables and their correlations inform us about global individual differences in process approaches and performance.
  • RQ3. Can the performance also be considered as uni-dimensional at the specific level of the individual items (from all four Xandar parts)?

In this Xandar investigation, each factor (latent variable) is composed of four corresponding measures from the four Xandar parts. Data are fit with a latent variable model to answer RQ1. Dependencies within parts can be expected between the three measures. So we address the extra dependencies in RQ2. The dependencies are not only considered for methodological reasons when variables stem from the same part, but they may also reveal how subjects work on the tasks. Finally, because a good-fitting factor model would imply uni-dimensionality of the performance sum scores from the four parts, we also explore uni-dimensionality at the level of the individual items in RQ3.
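The hypothesized three-factor structure can be sketched with simulated rather than PISA data. In the sketch below, the factor correlation matrix (chosen only to mirror the reported sign pattern: more actions going with lower performance and less time, more time going with higher performance), the loading of 0.8, and the indicator residuals are all illustrative assumptions, not fitted estimates from the Xandar analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 986  # size of the analyzed United States Xandar sample

# Hypothetical factor correlation matrix for (Actions, Time, Performance).
# Signs follow the reported pattern: more actions <-> less time and lower
# performance; more time <-> higher performance.
phi = np.array([
    [ 1.0, -0.4, -0.3],
    [-0.4,  1.0,  0.4],
    [-0.3,  0.4,  1.0],
])
factors = rng.multivariate_normal(np.zeros(3), phi, size=n)

# Each latent variable is measured by four indicators, one per task part,
# with an illustrative loading of 0.8 and unit-variance residuals.
def indicators(f):
    return 0.8 * f[:, None] + rng.normal(size=(n, 4))

A = indicators(factors[:, 0])  # action counts, Parts 1-4
T = indicators(factors[:, 1])  # times, Parts 1-4
P = indicators(factors[:, 2])  # performance scores, Parts 1-4

# Sum scores per construct recover the sign pattern of phi (attenuated
# by the residual variance of the indicators).
a, t, p = A.sum(axis=1), T.sum(axis=1), P.sum(axis=1)
corr = np.corrcoef([a, t, p])
print(np.round(corr, 2))
```

A full CFA would estimate the loadings and factor correlations from the observed data and add the extra dependencies of RQ2; this simulation only shows how the latent correlations propagate to observable sum-score correlations.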

Sections in this paper first discuss the PISA efforts to explore problem solving in 2012 and 2015 assessments, then offer a brief summary of the literature on CPS. Next in the Materials and Methods section, we discuss the PISA 2015 collaborative complex problem solving released task, “Xandar,” including the availability of the released code dictionary and data set. In the Results and Discussion, we model United States data from the Xandar task and report results to address the three research questions.

PISA and a Brief Summary of Literature on CPS

The PISA 2015 CPS construct, which included measuring groups in collaboration, was built on PISA’s 2012 conception of individual problem solving ( OECD, 2014 ). In PISA 2012, some student individual characteristics related to individual problem solving were measured. These measures were openness to learning, perseverance, and problem solving strategies.

For the 2015 PISA collaborative framework ( OECD, 2013 ), the construct of problem solving was extended from 2012 in order to include measures of group collaboration. For this new assessment in 2015, it was recognized that the ability of an individual to be successful in many modern situations involves participating in a group. Collaboration was intended to include such challenges as communicating within the group, managing conflict, organizing a group, and building consensus, as well as managing progress on a successful solution.

The PISA framework described the importance of improving collaboration skills for students (Rummel and Spada, 2005; Vogel et al., 2016). The measurement of collaboration skills was at the heart of problem solving competencies in the PISA CPS 2015 framework. The framework specified first that the competency being described remained the capacity of an individual, not the group. Second, the respondent must effectively engage in a process whereby two or more agents attempt to solve a problem, where the agents can be people or simulations. Finally, the collaborators had to show efficacy by sharing the understanding and effort required to come to a solution, such as pooling knowledge to reach solutions.

Approaches to gathering assessment evidence cited by the PISA CPS framework (OECD, 2013) ranged from allowing actions during collaboration to evaluating the results from collaboration. Measures of collaboration in the research literature include solution success, as well as processes during the collaboration (Avouris et al., 2003). In situ observables for such assessments could include analyses of log files in which the computer keeps a record of student activities, sets of intermediate results, and paths taken along the way (Adejumo et al., 2008). Group interactions also offer relevant information (O’Neil et al., 1997), including the quality and type of communication (Cooke et al., 2003; Foltz and Martin, 2008; Graesser et al., 2008) and judgments (McDaniel et al., 2001).
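As a minimal illustration of the log-file analysis mentioned above, the sketch below derives per-part action counts and elapsed time from a toy event log. The log format and field names are hypothetical, not the actual PISA log schema.

```python
from collections import defaultdict

# Hypothetical event log: (part, timestamp_in_seconds, action_type).
# The entries and action names are illustrative only.
log = [
    (1, 12.0, "chat_post"),
    (1, 30.5, "open_resource"),
    (2, 61.0, "chat_post"),
    (2, 95.2, "map_search"),
    (2, 120.4, "chat_post"),
]

actions = defaultdict(int)   # number of actions per task part
first_last = {}              # (first, last) timestamp seen per part

for part, t, _ in log:
    actions[part] += 1
    lo, hi = first_last.get(part, (t, t))
    first_last[part] = (min(lo, t), max(hi, t))

# Elapsed seconds between the first and last logged event of each part,
# one simple proxy for time invested in that part.
time_spent = {p: hi - lo for p, (lo, hi) in first_last.items()}
print(dict(actions))
print(time_spent)
```

Counts and times derived this way are exactly the kind of numeric process measures (actions, time) that PISA released for each part of the Xandar task.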

The international Assessment and Teaching of 21st Century Skills (ATC21S) project also examined the literature on disposition to collaboration and to problem solving in online environments. ATC21S described how interface design feature issues and the evaluation of CPS processes interact in the online collaboration setting (Scalise and Binkley, 2009; Binkley et al., 2010, 2012).

In the PISA 2015 CPS assessment, a student’s collaborative problem-solving ability is assessed in scenarios where the student must solve a problem. For collaboration, the problem is solved working with “agents,” or computer avatars that simulate collaboration. The CPS framework describes that a problem need not be a subject-matter-specific task; it could also be a partial task in an everyday problem. Examples of subject-matter-specific problem solving include setting up a sustainable fish farm in science, planning the construction of a bridge using engineering and mathematics, or writing a persuasive letter using language arts and literacy. Examples of an “everyday” problem include communicating with others to delegate roles during collaboration for event planning, monitoring to ensure a group remains on task, and evaluating whether collaboration is complete. All these actions can be directed toward the ultimate goal.

In the PISA 2015 perspective, assessment is continuous throughout the unit and can incorporate the student’s interactions with the digital agents. Each student response on a traditional question follows a stream of actions during which the student has chosen how to interact and collaborate with standardized agents in each particular task situation. Very few of the collaborative actions and tasks are released by PISA, but the number of collaborative actions in each part of the task is released and made available in the PISA data sets. So here we accept that PISA has coded each action as taking place, and we analyze the numeric results provided.

Materials and Methods

Parts of the Xandar Task

Here we analyze numeric data provided for the PISA 2015 Xandar unit ( OECD, 2017a , 2017b ). In the unit Xandar:

“A three-person team consisting of the student test-taker and two computer agents takes part in a contest where [the team] must answer questions about the fictional country of Xandar. The questions [involve] Xandar’s geography, people and economy. This unit involves decision-making and coordination tasks, requires consensus-building collaboration, and has an in-school, private, and non-technology-based context.”

Xandar is a fictional planet appearing in comic books published by Marvel Comics. In the PISA Xandar task, it is treated as a mythical location to be investigated collaboratively. The Xandar task has four parts:

  • • Part 1 – Agreeing on a Strategy. This part of the Xandar activity familiarizes the student with how the contest will proceed, the chat interface and the task space including buttons that students can click to take actions in particular situations and a scorecard that monitors team progress. In Part 1, the student is assigned to work in a team with digital agents named Alice and Zach. A variety of actions are available. The respondent and the agents interact to generate a stream of actions. The respondent is expected to follow the rules of engagement provided for the contest and to effectively establish collaborative and problem-solving strategies that were the goal of Part 1.
  • • Part 2 – Reaching a Consensus Regarding Preferences. In this part of the Xandar activity, group members should take responsibility for the contest questions in one subject area (Xandar’s geography, people, or economy). The team members must apportion the subject areas among themselves. The agents begin by disagreeing. The student has opportunities to help resolve the disagreement, can take a variety of actions, and the goal is to establish common understanding.
  • • Part 3 – Playing the Game Effectively. In this part of the Xandar activity, group members begin playing the game by answering geography contest questions together. The group has the opportunity to choose among answers, during which the agents interject questions, pose concerns and even violate game rules. The student exhibits collaborative problem solving strategies through actions and responses.
  • Part 4 – Assessing Progress. In this part of the Xandar activity, agent Alice poses a question about the team’s progress, and the student responds with an evaluation. Regardless of the student’s answer, agent Zach indicates he is having trouble finding information for his assigned subject area, economy. The student’s responses and actions involve both evaluating and supporting group members.

Each of the four parts comes with a number of items used to score performance. The complete Xandar released task is presented in an OECD PISA report that illustrates the items students faced in the 2015 PISA collaborative problem-solving assessment ( OECD, 2016 ). The released code dictionary and data are also available on the 2015 PISA website. We do not repeat the Xandar information here (due in part to copyright) but only summarize it. The Xandar released unit presents:

  • a screenshot of each item
  • the correct action(s) or response to the item
  • an explanation of why the action or response is correct
  • the skills that are examined by the item
  • alignments describing the difficulty of the item.

As described earlier, this study employed data publicly released from the Organization for Economic Co-operation and Development Program for International Student Assessment (OECD PISA) for the optional collaborative problem solving (CPS) assessment. It was administered in 2015 to nationally representative samples of students approximately 15 years old. Since PISA is designed to have systematically missing data in a matrix sample, only students who took the Xandar task were included. Students were sampled according to the PISA sample frame. Data analyzed here are representatively sampled United States participants from the Xandar released task. See Table 1 for descriptives by age, gender, and race/ethnicity of the United States Xandar task sample used.

Table 1. Descriptives for the collaborative problem solving Xandar assessment for the United States sample.

Of the 994 students who took the Xandar task, 986 have complete Xandar data; the descriptive statistics and all analyses are based on N = 986. (Note that, as discussed in the limitations later in this manuscript, only United States data have been examined to date in this exploration. Extensions to more countries and cross-country comparisons are an exciting and interesting potential for the work but are out of scope for this article.) For the purposes of the current study, the school variable was not employed, and all students were treated as one group.

Regarding ethical approval and consent for human subjects data collection in PISA, OECD gains ethical approval and consent through PISA processes. These processes are established in coordination with each country for public release of some de-identified data collected in PISA main study assessments. Data sets made available for release are intended for purposes of secondary research. The CPS data set used here is available through the OECD data repository website 1 .

As discussed earlier, for the Xandar task, released data are available for actions, time, and level of performance. The data for the current study included four indicators each of CPS actions taken (Parts 1–4), time taken (Parts 1–4), and success scores (Parts 1–4). These become the three latent traits, or factors, in this study. To measure CPS actions, we used the number of collaboration actions, as provided in the log transformation of C1A, C2A, C3A, and C4A; “C” indicates a collaborative assessment, the numeral indicates the Xandar part, and “A” indicates the number of actions taken. To measure timing, we used the log transformation of C1T, C2T, C3T, and C4T, where “T” indicates time taken. To measure student success, we used the sum of the binary item response success scores for each of the four parts, C1P, C2P, C3P, and C4P (based on 5, 3, 2, and 2 items within the Xandar parts).
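The analysis in the article was run in R, but the variable construction described above can be sketched in Python with pandas for illustration. The column codes (C1A…C4A, C1T…C4T, C1P) come from the text; the toy values and the per-item column names are hypothetical.

```python
import numpy as np
import pandas as pd

# Hypothetical records for two students: action counts and time (seconds)
# per Xandar part, following the C<part>A and C<part>T codes in the text.
df = pd.DataFrame({
    "C1A": [12, 30], "C2A": [8, 15], "C3A": [20, 41], "C4A": [5, 9],
    "C1T": [240, 410], "C2T": [180, 260], "C3T": [300, 520], "C4T": [90, 150],
})

# Log-transform action counts and times, as described for C1A..C4A and C1T..C4T.
for col in ["C1A", "C2A", "C3A", "C4A", "C1T", "C2T", "C3T", "C4T"]:
    df["log_" + col] = np.log(df[col])

# Success score for a part: sum of its binary item responses.
# Part 1 has 5 items; the item-column names here are hypothetical.
part1_items = pd.DataFrame({f"item{i}": [1, 0] for i in range(1, 6)})
df["C1P"] = part1_items.sum(axis=1)

print(df[["log_C1A", "C1P"]])
```

The same sum-score construction would be repeated for C2P–C4P with their 3, 2, and 2 items.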

Exploratory data analysis, following log transformation as described above for some variables, revealed only minor deviations from normality. A skewness criterion of −2 to +2 was applied to all observed variables ( Cohen et al., 2002 ). Note, however, that this is not a strongly conservative range, as discussed in the limitations. We therefore also report that skewness for all observed variables was approximately in the range −1 to 1, except for C1A (1.52) and C2A (1.48). Because there were no major deviations, the analysis proceeded without further transformation of the observed variables. Other descriptives for all observed variables are provided in Table 2 .
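The skewness screen described above can be reproduced with a small helper. The ±2 criterion and the tighter ±1 range come from the text; the helper itself is simply the standard sample-skewness formula (third central moment over the cubed standard deviation), a sketch rather than the authors' exact procedure.

```python
import numpy as np

def skewness(x):
    """Sample skewness: third central moment over cubed standard deviation."""
    x = np.asarray(x, dtype=float)
    m, s = x.mean(), x.std()
    return ((x - m) ** 3).mean() / s ** 3

def flag_skewed(named_vars, threshold=2.0):
    """Return names of variables whose |skewness| exceeds the threshold."""
    return [name for name, values in named_vars.items()
            if abs(skewness(values)) > threshold]

# A symmetric variable has skewness 0; a long right tail is positive.
symmetric = [1, 2, 3, 4, 5]
right_tailed = [1, 1, 1, 1, 20]
print(skewness(symmetric))  # 0.0
print(flag_skewed({"sym": symmetric, "tail": right_tailed}, threshold=1.0))  # ['tail']
```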

Table 2. Descriptives for observed variables.

We fit the model using lavaan ( Rosseel, 2012 ) in R version 3.5.1 ( R Core Team, 2018 ). We used the weighted least squares “WLSMV” option, which employs the diagonally weighted least squares (DWLS) estimator with robust standard errors and a mean- and variance-adjusted test statistic. We estimated a confirmatory factor analysis (CFA) model with three factors (each with standardized latent variables): Actions, Time, and Performance. Each factor has the four corresponding measures from the four Xandar parts.

Because dependencies within parts can be expected between the three measures, some parameters were added to the model: direct within-part effects of actions on time (more actions imply more time), direct within-part effects of performance on time (better performance may take more time), and correlated residuals for actions and performance within each part (exploring the relationship between actions and performance level).
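The article's exact specification is available in its Supplementary Material; as a sketch, the factor structure plus the within-part dependencies just described can be written in lavaan's model syntax. The snippet below assembles that syntax as a Python string, using the observed-variable labels A1–A4, T1–T4, and P1–P4 from Figure 1 (`=~` defines factor loadings, `~` a directed effect, `~~` a residual covariance).

```python
# lavaan model syntax for the three-factor CFA with within-part dependencies.
# A1..A4 = actions, T1..T4 = time, P1..P4 = performance, per Figure 1.
loadings = [
    "Actions     =~ A1 + A2 + A3 + A4",
    "Time        =~ T1 + T2 + T3 + T4",
    "Performance =~ P1 + P2 + P3 + P4",
]
# Within each part: actions -> time and performance -> time (direct effects),
# plus a correlated residual between actions and performance.
dependencies = []
for part in range(1, 5):
    dependencies.append(f"T{part} ~ A{part} + P{part}")  # direct effects on time
    dependencies.append(f"A{part} ~~ P{part}")           # correlated residuals

model = "\n".join(loadings + dependencies)
print(model)
# In R, this string would be passed to lavaan along the lines of:
#   fit <- cfa(model, data = xandar, estimator = "WLSMV", std.lv = TRUE)
```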

Direct effects and residual correlations are two different types of dependencies. Direct effects are effects of one variable on another (e.g., of Y 1 on Y 2 ); the two directions, Y 1 → Y 2 and Y 2 → Y 1 , are not mathematically equivalent. Correlated residuals are equivalent to the effect of a residual of one variable on the other variable (e.g., of ε Y1 on Y 2 ); these two directions are mathematically equivalent, and equivalent to the covariance of the residuals. To be clear, neither of these dependencies proves a causal relation. A causal hypothesis can be the basis for hypothesizing a direct effect, whereas correlated residuals can be used for explorative purposes, without specifying a direction. For the present study, we hypothesized that more actions take more time and that a higher level of performance requires more time. For the number of actions and level of performance, we explored the dependency with correlated residuals.
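The asymmetry of direct effects versus the symmetry of a covariance can be illustrated numerically: when two variables have unequal variances, the regression slope of one on the other differs by direction, while the covariance is a single symmetric quantity. A minimal sketch with simulated data (not the study's data):

```python
import numpy as np

rng = np.random.default_rng(0)
y1 = rng.normal(size=2000)
y2 = 0.5 * y1 + rng.normal(scale=2.0, size=2000)  # y2 has much larger variance

# Direct effects: the slope depends on which variable is the predictor.
cov_12 = np.cov(y1, y2)[0, 1]
b_21 = cov_12 / np.var(y1, ddof=1)  # effect of y1 on y2 (near the true 0.5)
b_12 = cov_12 / np.var(y2, ddof=1)  # effect of y2 on y1 (much smaller)

# A covariance, like a residual covariance, is symmetric in its arguments.
cov_21 = np.cov(y2, y1)[0, 1]

print(round(b_21, 2), round(b_12, 2))
print(np.isclose(cov_12, cov_21))  # True
```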

See the row heads of Tables 3, 4 and Figure 1 for a definition of the estimated model. It includes the latent variable structure as well as the dependencies. The model can also be derived from the R code for the analysis, which is available in the Supplementary Material .

Table 3. CFA factor loadings for Xandar measures.

Table 4. Extra dependencies in the CFA model for Xandar measures.

Figure 1. Latent variable and dependency model for Xandar data. The latent variables are Time, Actions, and Performance. The observed variables per factor are indicated with a capital letter referring to the latent variable (T, A, P) and a number referring to the Xandar part (1, 2, 3, and 4). The direct effects between observed variables from the same Xandar part are indicated with single-headed dashed arrows (between the A and T and between the P and T). The correlated residuals are indicated with dotted lines without arrows. Significance ( p < 0.01) is denoted with a thicker dashed arrow (direct effects) or line (correlated residuals). All dependencies are positive except when indicated with “neg” (between A1 and P1). Correlations between latent variables, factor loadings, residual variances, and dependency values are omitted to avoid clutter in the figure. The correlations between the latent variables can be found in the text, the factor loadings are presented in Table 3 , and the dependency values in Table 4 .

In this section we describe the results of the modeling. With the dependencies described in the Methods section added to the model, the model fit was good (close), with a TLI of 0.95 and an RMSEA of 0.038 (90% CI 0.029 to 0.048). Without the dependencies (i.e., without the eight direct effects and four residual correlations), the model fit is clearly worse, with a TLI of 0.574 and an RMSEA of 0.112 (90% CI 0.104 to 0.119). These results address RQ1 and RQ2.

The correlations between the latent variables are −0.473, p < 0.001 (Actions and Time), −0.732, p < 0.001 (Actions and Performance), and 0.190, p < 0.01 (Time and Performance). The loadings and dependencies are shown in Tables 3 and 4, respectively. As expected, the indicators of actions, time, and performance all showed significant positive factor loadings on the corresponding factors (see Table 3 ). The standardized coefficients in the last column indicate that the loadings of the Part 4 indicators are lower than those of the other three parts: 0.19 (Actions), 0.43 (Time), and 0.38 (Performance).

Table 4 shows the estimates of the dependencies:

  • More actions take more time: a significant positive effect of actions on time was found for all four parts.
  • A significant positive effect of performance on time was found only for Part 4. For the other parts the effect was almost zero.
  • Number of actions and performance level have significant correlated residuals for two parts. For explorative reasons these dependencies were not tested with a direction but with correlated residuals instead. The results differed by part: a negative dependency for Part 1, a positive dependency for Part 4, and an almost zero dependency for Parts 2 and 3.

Although the factor model with these dependencies fits well, we wanted to check whether performance is also uni-dimensional at the level of the individual items (RQ3). Uni-dimensionality of the four sum scores, as implied by the factor model, does not imply uni-dimensionality at the level of the 12 individual binary items. This is especially because the items represent four processes (exploring and understanding, representing and formulating, planning and executing, and monitoring and reflecting) and three competencies (establishing and maintaining shared understanding, taking appropriate action to solve the problem, and establishing and maintaining team organization), but not in a perfectly crossed design.

The answer to the dimensionality question, based on the analysis of this data set, is that the 12 items can be considered uni-dimensional based on the empirical data, although they are designed to tap a diversity of processes and competencies. The uni-dimensional model fit was good (close), with a TLI of 0.94 and an RMSEA of 0.037 (90% CI 0.029 to 0.046). The uni-dimensional model is the result of an ordinal confirmatory factor model for the binary items using WLSMV and the same lavaan version as for the earlier analysis. For the delta parameterization, the loadings vary between 0.272 and 0.776 and are all significant ( p < 0.001).

For the model with the loadings and dependencies shown in Tables 3 and 4, the latent variable correlations of Actions with Time and with Performance are negative. Hence, participants showing more activity are faster and perform less well in their collaborative problem solving. This is based on the United States data set with the Xandar task. Successful participants take more time, perhaps a consequence of the previous two relationships. Multiplying the two negative correlations yields −0.473 × −0.732 = 0.346, which is higher than the 0.190 estimate of the correlation between Time and Performance. This explains why, in an alternative but formally equivalent model with an effect of Actions on Time and on Performance, the correlation between the residuals of the latent variables Time and Performance is negative. However, the correlation of −0.260 in question is not significant ( p > 0.05).
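The arithmetic in this paragraph can be checked directly. With the three latent correlations reported above, the residual correlation between Time and Performance after removing the effect of Actions is the partial correlation, which reproduces the −0.260 mentioned in the text:

```python
import math

r_at = -0.473  # Actions-Time
r_ap = -0.732  # Actions-Performance
r_tp = 0.190   # Time-Performance

# The part of the Time-Performance correlation carried through Actions.
implied = r_at * r_ap
print(f"{implied:.3f}")  # 0.346, larger than the observed 0.190

# Partial correlation of Time and Performance controlling for Actions,
# i.e. the correlation between the residuals in the equivalent model.
partial = (r_tp - r_at * r_ap) / math.sqrt((1 - r_at**2) * (1 - r_ap**2))
print(f"{partial:.3f}")  # -0.260
```

Because the implied product (0.346) exceeds the observed 0.190, the residual correlation must be negative, as the text states.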

The negative correlation between Actions and Time suggests that highly active students are fast and less active students are slow. The combination of fast and active on the latent variables seems to reflect an impulsive, fast trial-and-error style. In the Xandar task this strategy shows itself as not very successful, in contrast with a slower, more thoughtful, and apparently more successful style. It makes sense that respondents who are more deliberative may have more knowledge to bring to considering a successful solution, or may be exhibiting more test effort in the Xandar context. We do not have the information to examine what is happening during the deliberation. This is in part because descriptions of the possible actions are not available in the data set, and no interpretive information is provided by PISA for the sample. Such information could include think-alouds in which students describe why they are doing what they are doing, qualitative response process information in which students explain their processes, in-depth interviews, or other approaches that supply interpretive information.

However, it of course makes sense that more actions take more time, which shows in the analysis of the dependencies between observed actions and time. This illustrates why it is informative to differentiate relationships between latent variables from relationships that show up in dependencies.

Other important dependencies concern Part 4, which is a clearly reflective task, a kind of reflective and evaluative pause. The nature of the task may explain why performance is associated with more actions and requires more time, in contrast with Part 1 (agreeing on a strategy), where the association between actions and performance is negative. For instance, too much discussion of a strategy may signal a lack of structure.

The result that the items examined can be considered uni-dimensional, although they are designed to tap a diversity of processes and competencies, suggests that collaborative ability generalizes across processes. In other words, the collaborative competencies rely on a general underlying ability. The specificities of the processes are reflected in the extra dependencies. Part 4 involves monitoring and reflecting, which may explain why more activities and more time are associated with better performance. Part 1, by contrast, involves planning and executing and representing and formulating, which may lead to better results if based not on trial and error (many actions) but on a structured and goal-oriented approach (fewer actions).

These dependencies suggest that, depending on the task, the collaborative ability may rely on a general underlying ability but be implemented through a different approach in various collaborative actions, as has been discussed in the literature ( Fiore et al., 2017 ; OECD, 2017b ; Eichmann et al., 2019 ). The special and specific status of Part 4 is also reflected in its lower loadings on all three latent variables (see standardized loadings).

Note that the extra dependencies here are not considered only for methodological reasons when variables stem from the same part. They may also reveal how subjects work on the tasks, which is consistent with the findings here. Parts such as 1 and 4 have distinct theoretical descriptions in the PISA framework, but the empirical process data suggest that they draw on the collaborative ability through different approaches.

Taken together, these results for the United States data set are consistent with problem solving performance modeled as invested time and number of actions.

Potential impacts underscore that it seems possible both to collect and to scale information on collaborative ability. Such measures may help provide intervention support, since in today’s world good collaborative skills are necessary in any group, ranging from families to corporations, public institutions, organizations, and government agencies ( OECD, 2013 ). Previously, dispositions to collaborate were reported based on the PISA data ( Scalise et al., 2016 ). Indicators of collaborative ability may also be needed to create adequate interventions to train collaboration skills and to change current levels of individual collaboration.

As previously reported, the disposition dimensions of collaborate , negotiate , and advocate / guide might be useful starting points for creating such interventions ( Scalise et al., 2016 ; OECD, 2017a ). Alternatively, the factor structure here may yield suggestions on additional interesting starting points. This could include structures by which a student may approach collaboration ( OECD, 2017b ; Wilson et al., 2017 ) but more interpretive information would be needed. This could be combined with how participatory a student is disposed to be in collaboration, along with his or her team leadership inclinations, and beliefs in the value or efficacy of collaboration ( Scalise et al., 2016 ).

Limitations to the analysis here include that only the United States data set of many countries available in the PISA data was analyzed. So this analysis should be extended to more countries and results compared in future work.

Also, from a statistical standpoint as discussed earlier, missing data were excluded listwise. In addition, minor but not major skewness was seen in two of the observed variables. Finally, multilevel modeling was not employed so the nested nature of students within schools was not taken into account.

TLI and RMSEA were reported here as the two fit indices since they seem most commonly used in the educational assessment field for large-scale analyses, but considerations of fit-index choice for CPS data remain limited.

For limitations from a conceptual standpoint, OECD releases a limited range of information; for instance, items for only one of the 2015 collaborative problem solving tasks (Xandar) were released, and collaborative actions were numbered but not described in the data set and data dictionary.

This study has several implications for future work. First, the era of analyzing process data, and not only item response data, in robust assessment tasks is upon us (e.g., Praveen and Chandra, 2017 ). Approaches such as those used here could be applied to other constructs, not just problem solving. Models can consider how to explore two types of relationship:

  • at the level of general individual differences (the factors)
  • at the level of extra dependencies, which are direct effects and correlated residuals (independent of the factors)

These extra dependencies may provide a window on the underlying process dynamics (see Figure 1 ). As a further implication for future work, it would be helpful if a range of simplified visualizations could be developed for such complex analyses; standard plots that include dependencies seemed too complex to be fully useful.

For extensions to the specific modeling here, it would be important, as discussed earlier, to explore fitting the same or similar models across data sets from other countries ( Thomas and Inkson, 2017 ). This could be augmented by also modeling potential country-level effects at the item level, by exploring differential item functioning. Furthermore, it would be interesting to consider covariates available in the PISA student questionnaire (SQ) data set in relation to the collaborative ability examined here. This could include indicators for dispositions for collaborative problem solving that moved forward to the main PISA study ( Scalise et al., 2016 ). These indicators include student-level self-reports, available in the CPS SQ data set, of dispositions toward cooperation, guiding, and negotiating.

It should also be mentioned that other very interesting student-level indicators regarding additional preferences in collaboration had to be dropped from the PISA main study due to time limitations. Dropped indicators included dispositions toward collaborative leadership , as well as student-level indicators of in-school and out-of-school collaborative opportunities . While these could not be included in the main study because of time limitations for the PISA administration, the indicators were part of the field testing and could be very interesting to administer at the country level in other national or international assessments.

Teacher-level indicators are also available in the PISA data set that provide information on opportunity to learn (OtL) for students in the PISA CPS. These data include classroom-level OtL reports of team activities, grouping practices, types of collaborative activities, and types of rewards provided for engaging in successful teamwork. Exploring relationships here might allow more reflection on connections to potential interventions. The PISA data are cross-sectional but might help to inform research studies within countries.

In closing, it is important to mention that the creation and delivery of the innovative PISA CPS instrument included both simulated collaboration on a hard-to-measure construct ( Scalise, 2012 ) and sharing of some process data. This was critical to the examination here, as has been the case for other collaboration-oriented assessments ( Greiff et al., 2014 , 2015 , 2016 ). This analysis underscores that addressing the challenges of education in the 21st century may continue to require new data sources.

Author Contributions

All authors listed have made a substantial, direct and intellectual contribution to the work, and approved it for publication.

Conflict of Interest Statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

1 www.oecd.org/pisa/data/

Supplementary Material

The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/fpsyg.2019.01280/full#supplementary-material

  • Adejumo G., Duimering R. P., Zhong Z. (2008). A balance theory approach to group problem solving. Soc. Netw. 30, 83–99. doi: 10.1016/j.socnet.2007.09.001
  • Avouris N., Dimitracopoulou A., Komis V. (2003). On evaluation of collaborative problem solving: methodological issues of interaction analysis. J. Comput. Hum. Behav. 19, 147–167.
  • Binkley M., Erstad O., Herman J., Raizen S., Ripley M., Miller-Ricci M., et al. (2012). “Defining twenty-first century skills,” in Assessment and Teaching of 21st Century Skills, eds Griffin P., McGaw B., Care E. (New York, NY: Springer).
  • Binkley M., Erstad O., Herman J., Raizen S., Ripley M., Rumble M. (2010). “Assessment and teaching of 21st century skills: defining 21st century skills,” in White Paper released at the Learning and Technology World Forum 2010 (London).
  • Cohen J., Cohen P., West S. G., Aiken L. S. (2002). Applied Multiple Regression/Correlation Analysis for the Behavioral Sciences, 3rd Edn. Hove: Psychology Press.
  • Cooke N. J., Kiekel P. A., Salas E., Stout R., Bowers C., Cannon-Bowers J. (2003). Measuring team knowledge: a window to the cognitive underpinnings of team performance. Group Dyn. 7, 179–219.
  • Eichmann B., Goldhammer F., Greiff S., Pucite L., Naumann J. (2019). The role of planning in complex problem solving. Comput. Educ. 128, 1–12. doi: 10.1016/j.compedu.2018.08.004
  • Fiore S. M., Graesser A., Greiff S., Griffin P., Gong B., Kyllonen P., et al. (2017). Collaborative Problem Solving: Considerations for the National Assessment of Educational Progress. Available at: https://nces.ed.gov/nationsreportcard/pdf/researchcenter/collaborative_problem_solving.pdf (accessed April 11, 2018).
  • Foltz P. W., Martin M. J. (2008). “Automated communication analysis of teams,” in Team Effectiveness in Complex Organisations and Systems: Cross-Disciplinary Perspectives and Approaches, eds Salas E., Goodwin G. F., Burke S. (Boca Raton, FL: CRC Press).
  • Graesser A. C., Jeon M., Dufty D. (2008). Agent technologies designed to facilitate interactive knowledge construction. Discourse Process. 45, 298–322. doi: 10.1080/01638530802145395
  • Greiff S., Krkovic K., Hautamäki J. (2016). The prediction of problem solving assessed via microworlds: the relative importance of fluid reasoning and working memory. Eur. J. Psychol. Assess. 32, 298–306. doi: 10.1027/1015-5759/a000263
  • Greiff S., Wüstenberg S., Avvisati F. (2015). Computer-generated log-file analyses as a window into students’ minds? A showcase study based on the PISA 2012 assessment of problem solving. Comput. Educ. 91, 92–105. doi: 10.1016/j.compedu.2015.10.018
  • Greiff S., Wüstenberg S., Csapó B., Demetriou A., Hautamäki J., Graesser A., et al. (2014). Domain-general problem solving skills and education in the 21st century. Educ. Res. Rev. 13, 74–83. doi: 10.1016/j.edurev.2014.10.002
  • McDaniel M. A., Morgeson F. P., Finnegan E. B., Campion M. A., Braverman E. P. (2001). Use of situational judgment tests to predict job performance: a clarification of the literature. J. Appl. Psychol. 86, 730–740. doi: 10.1037/0021-9010.86.4.730
  • OECD (2013). PISA 2015: Draft Collaborative Problem Solving Framework. Available at: http://www.oecd.org/pisa/pisaproducts/Draft%20PISA%202015%20Collaborative%20Problem%20Solving%20Framework%20.pdf (accessed September 26, 2014).
  • OECD (2014). PISA 2012 Results: Creative Problem Solving: Students’ Skills in Tackling Real-Life Problems (Volume V). Paris: OECD.
  • OECD (2016). Description of the Released Unit from the 2015 PISA Collaborative Problem-Solving Assessment, Collaborative Problem-Solving Skills, and Proficiency Levels. Available at: https://www.oecd.org/pisa/test/CPS-Xandar-scoring-guide.pdf (accessed December 10, 2018).
  • OECD (2017a). “Chapter 17: questionnaire design and computer-based questionnaire platform,” in PISA 2015 Technical Report. Available at: http://www.oecd.org/pisa/data/2015-technical-report/ (accessed September 21, 2017).
  • OECD (2017b). PISA 2015 Results (Volume V): Collaborative Problem Solving, PISA. Paris: OECD Publishing.
  • O’Neil H. F., Chung G., Brown R. (1997). “Use of networked simulations as a context to measure team competencies,” in Workforce Readiness: Competencies and Assessment, ed. O’Neil H. F. (Mahwah, NJ: Lawrence Erlbaum Associates), 411–452.
  • Praveen S., Chandra U. (2017). Influence of structured, semi-structured, unstructured data on various data models. Int. J. Sci. Eng. Res. 8, 67–69.
  • R Core Team (2018). R: A Language and Environment for Statistical Computing. Vienna: R Core Team.
  • Rosseel Y. (2012). lavaan: an R package for structural equation modeling. J. Stat. Softw. 48, 1–36. doi: 10.18637/jss.v048.i02
  • Rummel N., Spada H. (2005). Learning to collaborate: an instructional approach to promoting collaborative problem solving in computer-mediated settings. J. Learn. Sci. 14, 201–241. doi: 10.1207/s15327809jls1402_2
  • Scalise K. (2012). “Using technology to assess hard-to-measure constructs in the CCSS and to expand accessibility,” in Proceedings of the Invitational Research Symposium on Technology Enhanced Assessments (Princeton, NJ).
  • Scalise K., Binkley M. (2009). “Transforming educational practice by transforming assessment: update on assessment & teaching of 21st century skills,” in PISA Problem Solving 2012 (Santa Barbara, CA).
  • Scalise K., Mustafić M., Greiff S. (2016). “Dispositions for collaborative problem solving,” in Assessing Context of Learning World-Wide (Methodology of Educational Measurement and Assessment Series), eds Kuger S., Klieme E., Jude N., Kaplan D. (Dordrecht: Springer).
  • Thomas D. C., Inkson K. (2017). “Communicating and negotiating across cultures,” in Cultural Intelligence: Surviving and Thriving in the Global Village, 3rd Edn (Oakland, CA: Berrett-Koehler), 76–97.
  • Vogel F., Wecker C., Kollar I., Fischer F. (2016). Socio-cognitive scaffolding with computer-supported collaboration scripts: a meta-analysis. Educ. Psychol. Rev. 29, 477–511. doi: 10.1007/s10648-016-9361-7
  • Wilson M., Gochyyev P., Scalise K. (2017). Modeling data from collaborative assessments: learning in digital interactive social networks. J. Educ. Meas. 54, 85–102. doi: 10.1111/jedm.12134

LIVES IN THE BALANCE

DIFFERENT LENSES, DIFFERENT PRACTICES, AND DIFFERENT OUTCOMES

Help kids solve the problems that are causing their concerning behavior…without shame, blame, or conflict.

WHERE COMPASSION AND SCIENCE INTERSECT

Collaborative & Proactive Solutions (CPS) is the evidence-based, trauma-informed, neurodiversity affirming model of care that helps caregivers focus on identifying the problems that are causing concerning behaviors in kids and solving those problems collaboratively and proactively. The model is a departure from approaches emphasizing the use of consequences to modify concerning behaviors. In families, general and special education schools, inpatient psychiatry units, and residential and juvenile detention facilities, the CPS model has a track record of dramatically improving behavior and dramatically reducing or eliminating discipline referrals, detentions, suspensions, restraints, and seclusions. The CPS model is non-punitive, non-exclusionary, trauma-informed, transdiagnostic, and transcultural.

And this website is the hub of free resources on the CPS model.


Exploring collaborative problem solving in virtual laboratories: a perspective of socially shared metacognition

  • Published: 18 May 2022
  • Volume 35, pages 296–319 (2023)


  • Hengtao Tang   ORCID: orcid.org/0000-0002-8846-7654 1 ,
  • Okan Arslan 2 ,
  • Wanli Xing 3 &
  • Tugba Kamali-Arslantas 4  


Socially shared metacognition is important for effective collaborative problem solving in virtual laboratory settings. A holistic account of socially shared metacognition in virtual laboratory settings is needed to advance our understanding, but previous studies have focused only on the isolated effect of each dimension on problem solving. This study therefore applied learning analytics techniques to develop a comprehensive understanding of socially shared metacognition during collaborative problem solving in virtual laboratories. We manually coded 126 collaborative problem-solving scenarios in a virtual physics laboratory and then employed K-means clustering analysis to identify patterns of socially shared metacognition. Four clusters were discovered. Statistical analysis was performed to investigate how the clusters were associated with the outcome of collaborative problem solving and how they related to the difficulty level of the problems. The findings provide theoretical implications that advance the understanding of socially shared metacognition in virtual laboratory settings, as well as practical implications for fostering effective collaborative problem solving in those settings.



Acknowledgements

The authors would like to thank the Concord Consortium, especially Dr. Paul Horwitz and Cynthia McIntyre for supporting this work.

Author information

Authors and Affiliations

Department of Educational Studies, University of South Carolina, Columbia, SC, 29208, USA

Hengtao Tang

Department of Educational Psychology & Leadership, Texas Tech University, Lubbock, TX, 79409, USA

Okan Arslan

School of Teaching and Learning, College of Education, University of Florida, Gainesville, FL, 32611, USA

Wanli Xing

Department of Special Education, Aksaray University, Aksaray, Turkey

Tugba Kamali-Arslantas


Corresponding author

Correspondence to Hengtao Tang.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article

Tang, H., Arslan, O., Xing, W. et al. Exploring collaborative problem solving in virtual laboratories: a perspective of socially shared metacognition. J Comput High Educ 35 , 296–319 (2023). https://doi.org/10.1007/s12528-022-09318-1

Download citation

Accepted : 28 March 2022

Published : 18 May 2022

Issue Date : August 2023

DOI : https://doi.org/10.1007/s12528-022-09318-1


  • Socially shared metacognition
  • Virtual laboratory
  • Collaborative problem solving
  • Learning analytics

Alona Pulde, M.D., and Matthew Lederman, M.D.

Collaborative Strategies for Screen Time

To manage screen time, understand motivations, negotiate balance, and build trust.

Posted April 12, 2024 | Reviewed by Monica Vilhauer

  • Empathic parenting sets a positive tone for constructive screen time discussions.
  • Recognizing underlying motivations helps us approach screen time discussions with empathy.
  • Collaborating with children on screen time fosters ownership and cooperation over conflict.

In today's digital age, managing screen time in families with multiple children can feel like navigating a labyrinth of conflicting needs and desires. Parents find themselves caught in a tug-of-war between wanting to foster healthy habits and dealing with the day-to-day challenges of family life, striking a delicate balance that allows their children to enjoy screen time without letting it consume their lives. Let's explore how principles from a Collaborative Non-Permissive Parenting approach can offer practical strategies for this pervasive challenge.

Understanding the Needs Behind the Screen Time

Recognizing the diverse needs of each child is the cornerstone of effective screen time management. It's important for parents to acknowledge the unique motivations behind their children's screen time behavior. Although the potential negative impact on health and family dynamics cannot be ignored, identifying these needs allows parents to approach discussions around screen time with empathy and understanding.

With this intention, they may discover that a child's insistence on hours of screen time stems from a need for autonomy, competence, or even relaxation after a long day of school. Children get to enter a world where they have full choice in what happens and what they do, while also experiencing competence in doing things they can't do as well in the real world. Devices also serve to distract us from daily stressors, so they can meet needs for self-care and regulation of the nervous system. Your children may even meet their needs for meaning and purpose in the "dream" game world as they try to achieve an important mission or goal. Games also meet needs for fun and play, which are essential for all people, especially children. At the same time, children might be less connected to their need for health and potentially unaware that device usage impacts health negatively.

Awareness of all these needs can shift the conversation from conflict around watching or avoiding screens to connection around meeting needs in a way that works for both parents and children. This could mean setting screen time limits together, along with finding new and exciting activities to engage in that don’t involve a device.

Empathy and Self-Regulation

Parents can lead by example in setting the tone for screen time discussions. The first step is to regulate yourself when you are feeling worried, frustrated, and overwhelmed. There are many tools and techniques you can use, including breathing exercises, guided imagery, mantras, listening to music, and going for a walk among others. The goal is to take a moment to connect to your own feelings and needs (worry about device use being harmful and addictive, longing for more presence from your children, or wanting to ensure you are optimizing their health and well-being) while naming your intention to connect with your children in a way that cares about your needs as well as theirs.

It is helpful to remember that their device usage at this moment isn’t going to make or break their health. However, the way you show up around their screen time can either build trust and connection with your children or erode it. By practicing self-regulation and empathy, you can create a safe space for open dialogue. For example, instead of reacting with frustration when your child refuses to turn off their device, you could take a moment to understand the feelings and needs driving the behavior.

This could sound something like, “It seems you are really enjoying your game. I am guessing you like the fun it brings. Maybe even the autonomy to enter a world where you have full choice in what happens and the confidence to do things you might not be as comfortable with in the real world? While I really get that, I also have a concern about how late it’s getting and a desire to make sure you get enough rest tonight. Do you have any ideas on how both of our needs could get met?” With this approach, you validate your child’s experience, express your concerns, and invite a collaborative discussion on what to do next.

Collaborative Problem-Solving

Collaboration is essential to build trust between parents and children. The connection that is fostered by this intention allows for expansive and creative solutions that aim to meet everyone’s needs. As such, it is not permissive (children don’t get to just do what they want) nor authoritarian (parents also don’t get to do only what they want). The more attached you are to the needs of all and the less attached you are to a certain time limit for the device, the more strategies will become available.

For example, maybe the device serves certain needs, while others are met differently (like playing outside together, finding meaning and purpose in books or hobbies, learning what competencies they wish to master and finding activities that can provide those skills, etc.). Involving children in the decision-making process empowers them to take ownership of their habits on and off screen. Children then maintain autonomy while learning to connect to their internal motivations and desires. Getting off the screens becomes a choice they are making with you, not because of you. Over time, trust around this collaborative approach will grow, and these conversations will become easier and quicker.


In the dynamic landscape of modern family life, navigating screen time requires more than just rules and restrictions: it necessitates empathy, collaboration, and understanding. Taking a Collaborative Non-Permissive Parenting approach can transform moments of tension into opportunities for growth and connection. Through empathy and collaboration, parents can empower their children to make informed choices and foster a sense of unity that enriches every aspect of family life. Remember, navigating screen time may not always be straightforward, but with patience, creativity, and a willingness to listen, families can discover the joy of shared experiences and meaningful connections that extend far beyond the digital realm.


Alona Pulde, M.D., and Matthew Lederman, M.D., New York Times bestselling authors and physicians, integrate medicine, nutrition, and nonviolent communication for lasting health and joy through WeHeal and webe kälm.


Collaborative Robotics is prioritizing ‘human problem solving’ over humanoid forms


Humanoids have sucked a lot of the air out of the room. It is, after all, a lot easier to generate press for robots that look and move like humans. Ultimately, however, both the efficacy and scalability of such designs have yet to be proven out. For a while now, Collaborative Robotics founder Brad Porter has eschewed robots that look like people. Machines that can potentially reason like people, however, are another thing entirely.

As the two-year-old startup’s name implies, Collaborative Robotics (Cobot for short) is interested in the ways in which humans and robots will collaborate, moving forward. The company has yet to unveil its system, though last year, Porter told me that the “novel cobot” system is neither humanoid nor a mobile manipulator mounted to the back of an autonomous mobile robot (AMR).

The system has, however, begun to be deployed in select sites.

“Getting our first robots in the field earlier this year, coupled with today’s investment, are major milestones as we bring cobots with human-level capability into the industries of today,” Porter says. “We see a virtuous cycle where more robots in the field lead to improved AI and a more cost-effective supply chain.”

Further deployment will be helped along by a fresh $100 million Series B, led by General Catalyst and featuring Bison Ventures, Industry Ventures and Lux Capital. That brings the Bay Area firm’s total funding up to $140 million. General Catalyst’s Teresa Carlson is also joining the company in an advisory role.

Cobot has the pedigree, as well, with staff that includes former Apple, Meta, Google, Microsoft, NASA and Waymo employees. Porter himself spent more than 13 years at Amazon. When his run with the company ended in summer 2020, he was leading the retail giant’s industrial robotics team.

Amazon became one of the world's top drivers and consumers of industrial robotics during that time, and the company's now ubiquitous AMRs stand as a testament to the efficiency of pairing human and robot workers.

AI will, naturally, be foundational to the company’s promise of “human problem solving,” while the move away from the humanoid form factor is a bid, in part, to reduce the cost of entry for deploying these systems.

IMAGES

  1. Practice collaboration and problem solving in teams by completing

    collaborative problem solving de

  2. collaborative problem solving strategies

    collaborative problem solving de

  3. 5 Expert Collaborative Problem-Solving Strategies

    collaborative problem solving de

  4. Collaborative Problem Solving

    collaborative problem solving de

  5. Collaborative Problem Solving Worksheet

    collaborative problem solving de

  6. 5 Tips to Make Collaborative Problem Solving Work for Your Team

    collaborative problem solving de

VIDEO

  1. Collaborative Problem Solving: Strategies for Success

  2. Collaborative problem-solving, globally

  3. Collaborative Computer-Based Tasks: Maximizing Teamwork

  4. Math Quest Problem Solving

  5. Mastering Collaborative Problem-Solving: Unlock Group Study Success!

  6. How to Develop Learners’ Collaborative Problem Solving Skills

COMMENTS

  1. Think:Kids : Collaborative Problem Solving®

    Collaborative Problem Solving® (CPS) At Think:Kids, we recognize that kids with challenging behavior don't lack the will to behave well. They lack the skills to behave well. Our Collaborative Problem Solving (CPS) approach is proven to reduce challenging behavior, teach kids the skills they lack, and build relationships with the adults in ...

  2. PDF Collaborative Problem Solving

    distinction between individual problem solving and collaborative problem solving is the social component in the context of a group task. This is composed of processes such as the need for communication, the exchange of ideas, and shared identification of the problem and its elements. The PISA 2015 framework defines CPS as follows:

  3. The effectiveness of collaborative problem solving in promoting

    The findings show that (1) collaborative problem solving is an effective teaching approach to foster students' critical thinking, with a significant overall effect size (ES = 0.82, z = 12.78, P ...

  4. PDF 2 What is collaborative problem solving?

    PISA 2015 defines collaborative problem-solving competency as: the capacity of an individual to effectively engage in a process whereby two or more agents attempt to solve a problem by sharing the understanding and effort required to come to a solution and pooling their knowledge, skills and efforts

  5. PDF Pisa 2015 Collaborative Problem-solving Framework July 2017

    Collaborative problem solving (CPS) is a critical and necessary skill used in education and in the workforce. While problem solving as defined in PISA 2012 (OECD, 2010) relates to individuals working alone on resolving problems where a method of solution is not immediately obvious, in CPS, individuals

  6. Collaborative Problem Solving: The Ultimate Guide

    As defined by Webster's Dictionary, the word collaborate is to work jointly with others or together, especially in an intellectual endeavor. Therefore, collaborative problem solving (CPS) is essentially solving problems by working together as a team. While problems can and are solved individually, CPS often brings about the best resolution to a ...

  7. How to ace collaborative problem solving

    To solve any problem—whether personal (eg, deciding where to live), business-related (eg, raising product prices), or societal (eg, reversing the obesity epidemic)—it's crucial to first define the problem. In a team setting, that translates to establishing a collective understanding of the problem, awareness of context, and alignment of ...

  8. Full article: Measuring collaborative problem solving: research agenda

    Defining collaborative problem solving. Collaborative problem solving refers to "problem-solving activities that involve interactions among a group of individuals" (O'Neil et al., Citation 2003, p. 4; Zhang, Citation 1998, p. 1).In a more detailed definition, "CPS in educational setting is a process in which two or more collaborative parties interact with each other to share and ...

  9. Parenting, Teaching and Treating Challenging Kids: The Collaborative

    This introductory training provides a foundation for professionals and parents interested in learning the evidence-based approach to understanding and helping children and adolescents with behavioral challenges called Collaborative Problem Solving (CPS). This online training serves as the prerequisite for our professional intensive training.

  10. Advancing the Science of Collaborative Problem Solving

    Collaborative problem solving (CPS) is an important 21st century skill that is increasingly recognized as being critical to efficiency, effectiveness, and innovation in the modern global economy (Fiore, Graesser, & Greiff, 2018; Organisation for Economic Co-operation and Development [OECD], 2017a).

  11. Exploring collaborative problem solving in virtual laboratories: a

    Collaborative problem solving is a fundamental skill for qualified STEM professionals (Jang, 2016).Solving complex problems in STEM education requires college students to take in diverse perspectives and develop multiple problem presentations (Eseryel et al., 2011).Therefore, STEM curricula, especially scientific laboratory activities, have increasingly integrated collaboration as a key method ...

  12. Think:Kids : Collaborative Problem Solving in Schools

    The Results. Our research has shown that the Collaborative Problem Solving approach helps kids and adults build crucial social-emotional skills and leads to dramatic decreases in behavior problems across various settings. Results in schools include remarkable reductions in time spent out of class, detentions, suspensions, injuries, teacher ...

  13. What Is Collaborative Problem Solving and Why Use the Approach?

    The Collaborative Problem Solving Approach. The Collaborative Problem Solving (CPS) approach represents a novel, practical, compassionate, and highly effective model for helping challenging children and those who work and live with them. The CPS approach was first articulated in the widely read book, The Explosive Child [ 3 ], and subsequently ...

  14. Collaborative Problem Solving

    The PISA 2015 Collaborative Problem Solving assessment was the first large-scale, international assessment to evaluate students' competency in collaborative problem solving. It required students to interact with simulated (computer) agents in order to solve problems. These dynamic, simulated agents were designed to represent different profiles of ...

  15. Collaborative problem solvers are made not born

    Finally, at a higher level, collaborative problem-solving requires keeping the team organized - for example, by monitoring interactions and providing feedback to each other. Team members need ...

  16. CPS Connection

    Rather than focusing on kids' concerning behaviors (and modifying them), CPS helps kids and caregivers solve the problems that are causing those behaviors. The problem solving is collaborative (not unilateral) and proactive (not reactive). Research has shown that the model is effective not only at solving problems and improving behavior but ...

  17. Collaborative Problem-Solving

    Collaborative Problem Solving (CPS) is a manualized intervention based on cognitive-behavioral techniques, ... tools to reduce restrictive practices (e.g., de-escalation techniques), involving patients and family in decision-making, and debriefing staff and patients after the use of restraint or seclusion. Seclusion declined by more than half as ...

  18. Collaborative Problem Solving: Processing Actions, Time, and

    This study is based on one collaborative problem solving task from an international assessment: the Xandar task. It was developed and delivered by the Organisation for Economic Co-operation and Development Programme for International Student Assessment (OECD PISA) 2015. We have investigated the relationship of problem solving performance with ...

  19. Collaborative Problem Solving® in Pediatric Primary Care

    Collaborative Problem Solving® is an evidence-based approach that provides caregivers with the skills to respond to challenging behavior. It promotes the understanding that children and youth with behavioral challenges lack the skill—not the will—to behave; specifically, skills related to problem-solving, flexibility, and frustration tolerance.

  20. Our Solution

    Collaborative & Proactive Solutions (CPS) is the evidence-based, trauma-informed, neurodiversity-affirming model of care that helps caregivers focus on identifying the problems that are causing concerning behaviors in kids and solving those problems collaboratively and proactively. The model is a departure from approaches emphasizing the use of ...

  21. PDF Exploring collaborative problem solving in virtual laboratories: a

    Collaborative problem solving is a fundamental skill for qualified STEM professionals (Jang, 2016). Solving complex problems in STEM education requires college ... coordinated and mutual engagement in regulating the group's problem solving" (De Backer et al., 2015, p. 325). Socially shared metacognition is integral for collaborative ...

  22. Collaborative Strategies for Screen Time

    Collaborative Problem-Solving. Collaboration is essential to build trust between parents and children. The connection that is fostered by this intention allows for expansive and creative solutions ...

  23. Collaborative Robotics is prioritizing 'human problem solving' over

    AI will, naturally, be foundational to the company's promise of "human problem solving," while the move away from the humanoid form factor is a bid, in part, to reduce the cost of entry for ...