Implementation and Impact of Authentic Learning in a Postgraduate Applied Physics Course

CHAN Taw Kuei

Department of Physics, Faculty of Science, National University of Singapore

Name:    Dr CHAN Taw Kuei
Address: Department of Physics, Block S12 Level 2, Science Drive 3, Singapore 117551

Recommended Citation:
Chan T. K. (2020). Implementation and impact of authentic learning in a postgraduate applied physics course. Asian Journal of the Scholarship of Teaching and Learning, 10(2). 153-170.


In this work, authentic learning was implemented in a postgraduate applied physics course at the National University of Singapore (NUS), based on the guidelines and interpretation by Herrington and Herrington (2006). Authenticity in this work is based on the context of a post-MSc/PhD academic and research career that the students will likely embark on after their graduation. The practices of applied physics research and the discourse at an actual scientific conference are simulated using three main learning tasks that were incorporated into the course: (a) a literature review on a specific topic related to the course, (b) the creation of a group poster, and (c) individual oral presentations. These expose students to the authentic tasks of reviewing established research literature, articulating and discussing ideas and concepts in a research group setting, designing research posters, and giving a concise presentation of a complete research work to colleagues and collaborators. Linear regression model analysis of student feedback scores over 10 academic years indicates that there is evidence, at the 5% level or better, that these tasks have significantly improved the computed overall effectiveness score, as well as students’ perception of learning in terms of the enhancement of their thinking ability, the receiving of timely and useful feedback, the development of relevant research skills, and an increased interest in the course content.

Keywords: Authentic learning, higher education, postgraduate education, physics education, physics education research, PER


The delivery of transferable skills to postgraduate students is increasingly important, due to the need for researchers who are not only skilled but also adaptable to the modern, dynamic research landscape within academia, research institutes, and research-intensive industries. Graduates equipped with transferable skills may apply them to different research fields, which is advantageous in situations where funding bodies have rapidly changing research priorities, or where a single research field is itself fast-changing. Countries such as Australia, the United States, and the United Kingdom (Gilbert et al., 2004) have issued common sets of guidelines to their universities regarding the transferable skills that all research students in higher degree programmes should be trained in. In particular, the UK Research Councils (RCUK) produced a joint statement listing 36 critical skills in which students should be trained (Bromley et al., 2007, listed in Appendix 1 on pp. 132-136). These are categorised into seven broad areas: (i) research skills and techniques, (ii) research environment, (iii) research management, (iv) personal effectiveness, (v) communication skills, (vi) networking and teamworking, and (vii) career management. Singapore does not have a similar set of guidelines issued to its universities; in NUS, we have the Lifelong Learning (L3) initiative, which mainly states a set of knowledge that students will gain, but mentions no explicit set of transferable skills that the initiative should deliver.

Authentic learning is inherently well-suited for skills training in postgraduate education, where authenticity in the learning tasks, contexts, and environment allows graduate students to apply and adapt the skills and knowledge gained in class to the tasks and problems they might face in a real-world setting. Authentic learning environments and tasks seek to “show students relevance and stimulate them to develop competencies that are relevant for their future professional or daily lives” (Gulikers et al., 2005). According to Herrington and Oliver (2000), schools and universities often deliver knowledge to students entirely in abstract terms and contexts. In the applied fields, knowledge delivered without providing any link to actual practice has little utility in real-life, problem-solving contexts “because this approach ignores the interdependence of situation and cognition. When learning and context are separated, knowledge itself is seen by learners as the final product of education rather than a tool to be used dynamically to solve problems” (Herrington & Oliver, 2000). Herrington further encouraged an approach where an authentic task or project constitutes a major component of a university course. Through this task or project, students are made to practise skills such as teamwork, oral articulation, systematic presentation of ideas, interpersonal communication, as well as project management (Herrington & Herrington, 2006). Indeed, multiple higher education institutions in different countries recognise the importance of matching the skills set of graduates with industry requirements, and have developed conceptual frameworks that specifically aim to enhance the employability of higher education graduates through authentic learning approaches (Botma et al., 2015; Ornellas et al., 2019).

In Physics Education Research (PER), researchers have studied various aspects of teaching and learning in physics, such as the conceptual change in students while learning physics (Dewey et al., 1992), students’ beliefs about learning physics (Adams et al., 2006), problem-solving skills (Leak et al., 2017), and even gesture analysis (Scherr, 2008). However, research into the teaching of postgraduate physics is far less common than that of undergraduate physics. Much of the discussion of transferable skills training at the postgraduate level tends to be general discussion of graduate education across all fields and is not specific to physics (e.g., Bromley et al., 2007; Gilbert et al., 2004; Parker, 2012; Cargill, 2004).

Two articles on postgraduate PER stand out, however. The study by Leak et al. (2017) noted certain characteristics of how physics postgraduate students perform problem solving. For simpler, routine problems, students generally search online as individuals for resources or materials that will help them solve the problems. For more complex problems, they often seek help from their peers or people close to them. This reflects the fact that postgraduate physics research is usually not done in isolation, but as part of a research group. The study by O’Byrne et al. (2008) surveyed the graduates of PhD physics degrees in Australia on their perceptions of whether their physics education helped to prepare them for their current work. Several employers were also surveyed on their opinion of these physics graduates. For the graduates, the main positive feature of postgraduate physics education was the opportunity to take ownership of their PhD project within an environment that provided good supervision and collaborative support. When these aspects of their learning did not lead to good outcomes, they constituted the main reason why their learning experiences were negative. Employers indicated that there is a deficiency in oral and written communication skills in physics graduates, and some were especially critical of physics graduates in this area. Both these studies emphasise the importance of interpersonal and communication skills, as well as the opportunity for and experience in collaborative work, in facilitating the research of postgraduate physics students, both during their PhD degrees and in their future research careers.

In this work, the author describes the implementation of authentic tasks and assessments in a postgraduate applied physics course. The implementation is based on the framework and guidelines by Herrington and Herrington (2006), where they emphasised that it is the cognitive authenticity rather than the physical authenticity that is of prime importance in the design of authentic learning environments. The research question is whether such an implementation has significant quantitative impact on end-of-term student feedback scores. Linear regression analysis of selected end-of-term student feedback questions over 10 academic years was conducted to test whether there is a statistically significant improvement of the feedback scores when the course is switched into this authentic learning structure from a more traditional course structure.

Brief Description of the Course 

The course, PC5209 “Accelerator-based Material Characterization”, is a postgraduate course offered by the Department of Physics at the National University of Singapore (NUS). It covers the topic of Ion Beam Analysis (IBA), in which a particle accelerator is used to produce a stream of fast-moving charged particles, known as an ion beam, that probes the composition of material systems. The course has been taught a total of 10 times, from 2010 to 2019. The class consisted mostly of full-time Physics MSc and PhD students. The class sizes were small, typically ranging from 15 to about 30 students. Foreign students made up a large proportion of the class; many of them had little to no prior experience in the authentic tasks that they would undertake in this course.

In the original teaching format, the continuous assessment consisted of the mid-term test and an individual oral presentation based on a research article. The article was pre-selected and assigned to each individual student, and each student had to present the findings in the assigned article to the class with no requirement for collaborative work in a group setting. This approach is relatively passive and lacks authenticity in terms of the learning tasks and environment. The original teaching format was adopted for the first five years, and the new format was implemented in the subsequent five years.

Authentic Learning in Higher Education

Herrington and Herrington (2006) noted that it is common for institutions of higher learning to engage in the teaching of theoretical knowledge in a manner that is devoid of context. Knowledge is delivered by discipline experts in large lecture theatres, and learning is still largely in the form of the passive absorption of knowledge by individuals, where collaborative learning is not required. This has resulted in the tendency of students to engage in absorption or memorisation of knowledge with the sole purpose of regurgitating information during examinations. The transfer of knowledge and skills to the real-world setting is rarely assessed, and students therefore place little emphasis on it. In the applied fields, such an educational approach is increasingly out of touch with the needs of industries and prospective employers of university graduates. Herrington advocated an authentic form of learning where students not only gain the requisite knowledge, but do so in an environment where they learn to apply and adapt that knowledge in a real-world setting.

Herrington then proposed the following guidelines for designing authentic learning environments in higher education:

  • Provide an authentic context that reflects the way the knowledge will be used in real life
  • Allow students to engage in authentic activities
  • Provide access to expert performances and the modelling of research processes
  • Encourage students to explore different perspectives
  • Allow for collaborative construction of knowledge
  • Provide opportunities for students to reflect on their learning
  • Incorporate opportunities for articulation and public presentation of argument
  • Allow for collaborative learning and provide scaffolding
  • Provide authentic assessment of student learning

In this work, the emphasis is to deliver academic research skills, rather than knowledge, to the students. My interpretation of the above guidelines is therefore based on skills delivery. In the following sections, I introduce the authentic learning framework that was implemented for PC5209, and describe my interpretation and adaptation of relevant guidelines that are most suitable for application to physics research.

Authentic Learning Approach

The authentic learning approach introduces a series of authentic tasks in a group setting, and simulates a conference-based authentic learning environment as part of the continuous assessment component of the course. To ensure that the approach is meaningful, the development of the approach involved the following: 

  • Defining the context of authentic learning
  • Simulating an authentic learning environment in physics research
  • Identifying the transferable skills to be delivered
  • Designing learning tasks
  • Ensuring authenticity of the learning tasks
  • Planning the implementation and timeline 
  • Providing scaffolding: pre-submission meetings
  • Designing assessment emphases and rubrics

Defining the Authentic Context

The context for authentic learning defines the learning tasks for the students that will provide training for the respective set of transferable skills. Appropriateness of the context depends on the nature of students that attend the course. 

We shall make the assumption that most of the postgraduate students in this course will continue with a research career in Physics after they graduate from the MSc. or PhD. programme. The context for authentic learning for these students is therefore based on what full-time researchers in Physics are expected to face, perform, or experience during the course of their future research careers:  

  • Researchers in applied sciences regularly work on projects in collaboration with their peers, both within their research institution as well as from external institutions. Teamwork and project management are essential skills.
  • Many areas of modern research are interdisciplinary in nature, involving collaborations with researchers from other fields. Regular meetings are conducted with the collaborators to discuss the progress of their work, as well as to articulate how individual works contribute to the project as a whole. Such explanations have to be performed in a clear and succinct manner, especially to collaborators from other fields. Here, clear articulation of ideas and interpersonal relationship management are essential skills.
  • Researchers also regularly attend conferences to present their work as well as interact with other researchers to exchange ideas or gain new insights in their research area. Presentation of their work may be in the form of posters or oral presentations. The essential skills here would be designing scientific posters and the ability to plan, create, and give systematic oral presentations.


Simulating an Authentic Learning Environment in Physics Research

In order to simulate an authentic environment where research skills are learnt in this course, we consider the future work environment that the students will encounter. Experimental physicists usually work in research centres, where research is conducted in the form of major projects, generally with the aim of either publishing in peer-reviewed journals or producing patents. Each research project typically involves a sub-group of people led by a principal investigator (PI). The individuals in these sub-groups usually work as a team on different aspects of the project; progress in the individual aspects is essential for the project to advance towards its eventual completion. During the course of their work, each individual physicist must engage in discussions and collaboration with colleagues, and periodically conduct oral presentations to update everyone in the research centre on the progress of his or her own work. Attending local and international conferences is also expected.

In my framework, the class is split into multiple groups. Each group is tasked to work as a team to conduct literature review of an aspect of the subject matter (Ion Beam Analysis), with different groups working on different aspects. This is analogous to the sub-groups in actual research centres, where I serve as the PI of every group. While working as a group, students gain practice in honing their interpersonal communication and project management skills. I also introduced learning tasks such as conference poster creation and conference-style oral presentations. The framework aims to simulate an authentic learning environment in the classroom which reflects the way the relevant research skills are expected to be used in the future workplace of experimental physicists.

Identifying Transferable Skills to be Delivered

Transferable skills for postgraduates should ideally be developed over the entire course of the MSc and PhD programme, and the training should be jointly conducted by the project supervisor and the academic courses each student is required to read. As this is only one part of a single postgraduate course, only a limited range of transferable skills training can be delivered. 

Out of the 36 critical skills as listed by the RCUK, we address the following five areas:

  • Discipline Knowledge: Can communicate knowledgeably about their research topic with supervisor and peers, and adept at debating concepts. Familiar with recent relevant literature.
  • Library Skills: Able to collect and record information in an organised and professional way. Able to conduct searches using appropriate online and offline resources.
  • Critical Writing: Able to communicate own research orally and in written reports. Able to explain their research at a range of levels appropriate for, e.g., international conferences or non-specialist audiences. Able to produce clear and well-constructed presentations. Able to use slides and PowerPoint confidently and easily in oral presentations.
  • Research Presentation: Able to present academic work at seminars and conferences fluently and confidently.
  • Team-working: Can work in teams (e.g. research groups) on complex projects and can both reflect on quality of teamwork and solve team-working problems as they arise.

Designing Learning Tasks

With the context defined, learning tasks are designed that enable the training of students in the respective skills set. 

Students first form into groups, with each group performing the following tasks as part of their continuous assessment:

  • Task A: Use online search engines to perform literature review on the latest developments in a topic related to the course. The topic is kept broad and ill-defined. The aim is to expose students to actual case studies and applications of lecture content, to promote a broader view of the subject in the field of Ion Beam Analysis, as well as to illustrate interdisciplinary research applications.
  • Task B: From the articles obtained during the literature search, each group will identify relevant high-impact work, evaluate its suitability for and impact on the assigned topic, and create a conference poster with selected content that best represents or illustrates the state-of-the-art development in that topic. Specifications such as poster size and orientation, as well as font type and size, are provided. Posters that do not conform to these specifications will be rejected, and the group will have to resubmit a new one that does.
  • Task C: Every student is to deliver an individual oral presentation on the content of a complete research article published in a peer-reviewed journal. This article is selected by the student from the results of the literature review, subject to an evaluation of its suitability for a 10-minute presentation. Presentations are conducted back-to-back for all students in each group, in the form of a conference-style contributed talks session, conducted during lecture hours. Each 10-minute presentation is followed by a 5-minute question-and-answer (Q&A) session. These timings are strictly enforced, similar to those at actual conferences.

Ensuring Authenticity of the Learning Tasks

The task of creating a conference poster is by itself a real-world practice; students will almost inevitably attend research conferences, both before and after their graduation, where they will likely have to create clear and systematic posters to present. By also incorporating group work, the aim is for students to gain experience in teamwork, interpersonal communication, and the articulation of ideas to their peers during group meetings. Due to the breadth of each topic, the tasks are also less well-defined: there is no “correct” set of articles that should be selected, students have to figure out which specific aspect of the topic to focus on, and there is no prescribed way for the students to frame their narrative.

The individual oral presentation serves two purposes in authentic learning. First, each presentation constitutes an in-depth case study of actual research. Students are exposed to the procedures and instrumentation involved in real experiments. The real-life research questions, complexities, and problems encountered are typically described in the articles, along with the methods and steps by which they were resolved during experiments performed by actual researchers. This constitutes an illustration of how the knowledge gained during class is applied in a real-life setting. Second, students must articulate their ideas clearly to an audience (i.e., their classmates) during the oral presentations. Through preparation and practice for their presentations, the students gain experience in articulating physical concepts in a clear and succinct manner, as well as in producing a systematic and structured presentation to the entire class. Assessments are integrated within the tasks themselves, where students are graded during a conference-style poster session and contributed talk session.

Planning the Implementation and Timeline

The learning tasks and assessments are implemented in parallel with the regular lectures and tutorials. The implementation timeline according to the instructional weeks within a semester is described in Table 1.

Table 1 
Implementation timeline of authentic learning

A conference experience is simulated for the students, where they will conduct a literature review, create a poster, attend a poster session, and deliver an oral presentation. The poster session is conducted in an active learning room, where each group displays its poster electronically on a separate large-screen LED TV. Students may move around the room and mingle with other groups to look at different posters. On separate occasions, members of each group give back-to-back oral presentations on a general theme to simulate conference-style contributed talk sessions. A session chair (the author of this article) conducts each session, and the time allocation for each talk is enforced. The Q&A session is open to the floor, but will consist of at least one question from the session chair.

Feedback is provided to the students during the presentation and poster sessions, where I will present my remarks and provide suggestions on how they may improve on these essential skills.

Providing Scaffolding: Pre-submission Meetings

Students may not have prior experience with the learning tasks. In addition, the topics are very new to the students, and they may not be able to conduct an effective literature review by themselves. It is therefore necessary to provide guidance and formative feedback along the way, for the group work to be conducted smoothly and with the correct focus. Ungraded “pre-submission” meetings are therefore conducted at about the midway point (in Week 7), where I meet with individual groups outside of lecture hours to understand their concerns and clarify their doubts about what they are expected to achieve in their group work. There are two major aims of the ungraded meetings:

  1. To review the articles students have collected during the literature review, where I comment on their relevance and the sufficiency of content for the group poster and oral presentations. For groups that have gone off-topic in their literature search, I have a detailed discussion with them and point them to more appropriate articles.
  2. To ensure that each group is progressing in their work in a timely manner, to understand and address their concerns as well as any difficulties they encountered in their literature review, as well as provide suggestions to facilitate their work. 

Furthermore, each group receives a number of “baseline” articles on their topic, which serve as a set of standard references for research articles that are acceptable in topic and depth of content. These may in fact serve as starting points for the literature review, via title, keyword, author, or “citing articles” searches in search engines. In the rare circumstance where the literature search process fails, these articles serve as a safety net, and students may create the poster and prepare the oral presentation based on content from these baseline articles.

Designing Assessment Emphases and Rubrics

The assessment comprises two components: the group poster and the individual oral presentation, each constituting 15% of the course’s final grade. The requirements and grading emphases of each component were made known to students at the beginning of the semester.

For posters, the grading emphases are: 

  • Sufficiency and relevance of the new developments obtained during the literature review
  • Poster content, coherence, and clarity

For oral presentations, the grading emphases are:

  • Demonstration of the understanding of principles and concepts involved in the paper
  • Sufficiency of the motivation and content covered by the presentation
  • Clarity of the presentation

The grading rubrics are provided in Appendix A. All the students in each group were assigned the same grade for the poster, while the oral presentations were graded individually for every student.  

The aim of the grading is to assess the quality of the literature search and the sufficiency of content that it yields, whether the individual or the group has demonstrated an understanding of the research question and content they had worked on, as well as whether the motivation, significance and impact of the research have been clearly illustrated and articulated in the posters and oral presentations. 

Reflections of Practice: Challenges and Limitations

Students are expected to hold group meetings regularly to discuss the findings of their literature search and collate the disparate material into a coherent presentation in the poster. In the ideal situation, these meetings are held regularly, and each group progresses in their work throughout the semester. However, some students may have the tendency to leave their work to the last minute, have only one meeting late in the semester and then rush through the process, resulting in poor-quality work. The pre-submission meeting conducted at the midway point serves to prevent such a tendency. In addition, regular follow-up email exchanges and brief one-on-one meetings with individuals and group leaders are also useful to prompt students to complete their work in a timely manner.   

The group posters should ideally present a systematic illustration of the latest developments within the topic and contain a range of separate applications that collectively form a coherent presentation. However, students may decide to divide the poster into several disparate segments. Each student then produces their assigned segment independently, eliminating the need for group meetings on the collation of content for the poster and negating the teamwork component. To prevent this, it was clearly emphasised to the class that the posters should not simply comprise a collection of independent and disparate segments, both at the outset of the course and again during the pre-submission meeting with individual groups.

Another challenge is the language barrier. A large proportion of the postgraduate students in Physics are foreign students whose first language is not English. For these students, this may be among the first few times that they have ever given a presentation in front of an audience (and in English). Students may face the screen during their entire presentation, read off the screen word by word, read off handwritten or cell phone notes the entire time, or pace their presentations poorly due to difficulties in pronouncing English words. Such a lack of presentation skills tends to make the presentations difficult to understand, which can result in many students in the audience not paying attention to the talks, or skipping the sessions of other groups entirely. I have found no good way to address this shortcoming; the presentation sessions, however difficult to follow, serve as training for students to gain confidence and experience in oral presentations.

A major limitation is that the implementation of these tasks in the classroom is necessarily of limited authenticity. While the tasks are by themselves authentic, the implementation is not. A major reason stems from a lack of interest in the research topics. During the poster session, students may not be interested in moving around to view the posters from other groups, resulting in a poster session where everyone simply stays put by their own posters. The lack of interest also results in a lack of motivation and effort while performing the learning tasks. This manifests quite clearly in the quality of the group posters, as well as in the extent of preparation for individual presentations. Therefore, sparking an interest in the topics and motivating students to participate in these tasks is essential for the implementation to be meaningful and the training of transferable skills to be effective.

Another limitation is the lack of direct evidence of students gaining or improving in the stated skills set, as there is only one poster produced by each group and one oral presentation delivered per student. These tasks were performed while scaffolding was provided, in the form of pre-submission meetings as well as personal guidance via online and in-person consultations. To obtain direct evidence that the training has been effective, students would have to produce a second poster and a second presentation, performing this second set of tasks on a different topic with less scaffolding provided. These could then be compared with the first set as conclusive evidence of improvement. However, requiring students to perform a second set of tasks on a different topic is not feasible within the 13-week timeframe of a single semester, where regular lectures and tests run in parallel.

One way to track skills proficiency is to have the transferable skills training delivered to students as a concerted effort, where appropriate authentic learning tasks are integrated coherently across different postgraduate modules. The proficiency of these skills may then be tracked as the students progress from one module to the next during their postgraduate studies.

Linear Regression Modelling of Student Feedback Scores

With the lack of direct evidence, an indication of the effectiveness of authentic learning lies with students’ perception of learning as reflected in the end-of-term student feedback scores. Here, the research question is whether the implementation of authentic learning has had a significantly beneficial effect on the scores of five different questions in the student feedback:

Question 1: The teacher has enhanced my thinking ability
Question 2: The teacher has provided timely and useful feedback
Question 3: The teacher has helped me develop relevant research skills / The teacher encouraged independent and self-directed learning
Question 4: The teacher has increased my interest in the subject
Question 5: Computed Overall Effectiveness Score / Overall, the teacher is effective

We wish to investigate whether the average score for each question in the years after the implementation of authentic learning is significantly higher than the average score in the years before. If the implementation of authentic learning were the only factor that could influence the scores, Analysis of Variance (ANOVA) could be used. In ANOVA, the feedback scores for each question (e.g. overall effectiveness) over the 10 academic years would be divided into two groups: the five scores before the implementation of authentic learning, and the five scores after. The statistical parameters of each group are then calculated, and an appropriate hypothesis test is performed to test whether the post-implementation average score is significantly higher than the pre-implementation average score. 
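To make this procedure concrete, the hypothesis test can be sketched with hypothetical scores (the actual course data are not reproduced here), using a pooled two-sample t-test as one appropriate form of the test:

```python
import math

# Hypothetical feedback scores (NOT the actual course data): five academic
# years before and five after the implementation of authentic learning.
before = [3.8, 3.9, 3.7, 4.0, 3.8]
after = [4.2, 4.3, 4.1, 4.4, 4.2]

def mean(xs):
    return sum(xs) / len(xs)

def sample_var(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

# Pooled two-sample t-statistic for H0: the two group means are equal,
# against H1: the post-implementation mean is higher.
n1, n2 = len(before), len(after)
sp2 = ((n1 - 1) * sample_var(before) + (n2 - 1) * sample_var(after)) / (n1 + n2 - 2)
t_stat = (mean(after) - mean(before)) / math.sqrt(sp2 * (1 / n1 + 1 / n2))
df = n1 + n2 - 2
print(f"t = {t_stat:.2f} on {df} degrees of freedom")
```

The t-statistic is then compared against the upper tail of the t-distribution with 8 degrees of freedom; with real data, the equal-variance assumption should be checked first (Welch's correction is a common alternative).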

However, one limitation of ANOVA is that the data must be divided into separate groups. The criterion used to divide the data represents a variable with discrete states, e.g. before and after implementation. As such, ANOVA cannot take into account other potential influencing factors, such as the class size and the number of responses in each feedback exercise, which can take any value and do not follow discrete states. To take such variables into account, the more general method of linear regression analysis must be used. Linear regression is used in this work to consider the teaching format (0 = before implementation, 1 = after authentic learning has been implemented), the class size, and the number of respondents in the feedback exercise for each academic year. In the full regression model, we assume that the feedback score is a response variable that is a quantitative function of the predictor variables of teaching format, class size, and response size.

We assume the following linear regression model:

Yi = β0 + β1XFormat,i + β2XClass,i + β3XResponse,i + εi ,   i = 1, 2, …, 10


Yi = The score on the Likert scale for a particular question    
XFormat,i  = Format of teaching (0 = Original format; 1 = Authentic Learning format)
XClass,i  = Class Size (Total number of students in the class)
XResponse,i = Response Size (Number of respondents in the feedback for that semester)
εi = Error term, assumed to follow a Normal distribution with zero mean and variance of σ2
i = Year (from 1 to 10, AY2010 – AY2019)

We wish to investigate whether the student feedback score for each of the five questions (i.e. the response Yi) is significantly affected by the predictors of teaching format (XFormat,i), class size (XClass,i), and the number of responses to the student feedback (XResponse,i). 

The first step is to calculate the regression coefficients β0, β1, β2, and β3. We adopt the matrix formulation of linear regression: the intercept and predictors form the matrix X (Figure 1), while the five separate response variables corresponding to Questions 1 to 5 are expressed as the matrices Yth, Yfeed, Yres, Yint, and Yeff respectively (Figure 2). The coefficients and error terms are written as β = {β0, β1, β2, β3}T and ε = {ε1, ε2, …, ε10}T respectively. The regression model is Y = Xβ + ε, and the estimated regression coefficients are evaluated as β̂ = (XTX)-1(XTY).


Figure 1. The predictor matrix X.


Figure 2. The response variable matrices of  Yth, Yfeed, Yres, Yint, and Yeff, corresponding to Questions 1 to 5 respectively.
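As a sketch of this computation, the following uses made-up class sizes, response sizes, and scores (none of them the actual course data) to evaluate the coefficient estimates from the matrix formula:

```python
import numpy as np

# A minimal sketch of the matrix-form estimate beta = (X^T X)^{-1} X^T Y,
# using made-up values (NOT the actual course data). Columns of X:
# intercept, teaching format (0 = original, 1 = authentic learning),
# class size, response size.
fmt = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1], dtype=float)   # AY2010-AY2019
class_size = np.array([30, 28, 32, 31, 29, 27, 33, 30, 28, 31], dtype=float)
resp_size = np.array([20, 18, 25, 22, 19, 17, 26, 21, 18, 23], dtype=float)
X = np.column_stack([np.ones(10), fmt, class_size, resp_size])

# Hypothetical response: scores that rise by about 0.4 under the new format.
Y = 3.8 + 0.4 * fmt + np.random.default_rng(0).normal(0.0, 0.05, 10)

# Solving the normal equations (X^T X) beta = X^T Y is equivalent to the
# textbook formula beta = (X^T X)^{-1} (X^T Y), and more numerically stable.
beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)
print(beta_hat)   # [beta0, beta1 (format), beta2 (class), beta3 (response)]
```

In practice a least-squares routine such as `numpy.linalg.lstsq` gives the same estimates without forming the inverse explicitly.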

For every coefficient, we calculate the corresponding p-value that allows us to gauge its significance. The p-value for a regression coefficient is used to decide, via a hypothesis test, whether that coefficient is significant. If the decision is that the coefficient is significant, then there is statistical evidence that the student feedback scores are significantly affected by the predictor variable associated with that coefficient. The commonly accepted practice is to consider the 5% level of significance, where we conclude that a coefficient is significant at the 5% level if its p-value is smaller than 0.05. Generally, the degree of significance is considered to be greater for smaller p-values. Hence, for every estimated coefficient β̂j, we calculate its t-statistic and the corresponding p-value for a two-tailed t-test with degrees of freedom df = (Number of data points) - (Number of predictor variables) - 1. The detailed steps of the regression process are provided in Appendix B. The p-values of the full-model regression coefficients are shown in Table 2.  
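This t-test step can be sketched as follows, again with made-up data standing in for the actual scores; `scipy.stats.t` supplies the t-distribution tail probability:

```python
import numpy as np
from scipy import stats

# Illustrative p-value computation for each OLS coefficient, on made-up
# data (NOT the actual feedback scores).
fmt = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1], dtype=float)
class_size = np.array([30, 28, 32, 31, 29, 27, 33, 30, 28, 31], dtype=float)
resp_size = np.array([20, 18, 25, 22, 19, 17, 26, 21, 18, 23], dtype=float)
X = np.column_stack([np.ones(10), fmt, class_size, resp_size])
Y = 3.8 + 0.4 * fmt + np.random.default_rng(0).normal(0.0, 0.05, 10)

n, p = X.shape                      # p counts the intercept column as well
beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)
resid = Y - X @ beta_hat
df = n - p                          # = (data points) - (predictors) - 1 = 6
sigma2 = resid @ resid / df         # unbiased estimate of the error variance
se = np.sqrt(sigma2 * np.diag(np.linalg.inv(X.T @ X)))   # standard errors
t_stats = beta_hat / se
p_values = 2 * stats.t.sf(np.abs(t_stats), df)           # two-tailed test
for name, t, pv in zip(["intercept", "format", "class", "response"], t_stats, p_values):
    print(f"{name:9s} t = {t:7.2f}  p = {pv:.4f}")
```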

Table 2 
p-values of regression coefficients of the full model. The significant parameters are shaded.

The full model may not be meaningful, since not all of the predictors may be significant. A meaningful regression model must contain only predictor variables that are significant. However, it is not possible to simply drop all predictors with insignificant coefficients from the full model at once, since the presence (or absence) of any single predictor variable changes the p-values of all the remaining predictors, potentially affecting our decision on their significance. Therefore, to determine the parsimonious models, model selection for every question is performed using the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC). Both criteria provide a quantitative score for the quality of each regression model; in both, the smaller the score, the better the model. To this end, the model is re-fitted for every question using all possible combinations of the predictor variables, and the AIC and BIC scores are calculated for each combination. The results are shown in Table 3.

Table 3 
Fitted models with all possible combinations of predictor variables, sorted in order of decreasing BIC score. The parsimonious model to be selected has the lowest AIC and BIC scores, which are shaded.
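The exhaustive model search can be sketched as follows, fitting every subset of the three predictors to made-up data and scoring each fit with AIC and BIC (Gaussian log-likelihood with additive constants dropped):

```python
import itertools
import numpy as np

# Exhaustive subset search scored by AIC/BIC, on made-up data (NOT the
# actual feedback scores). Lower score indicates a preferred model.
fmt = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1], dtype=float)
class_size = np.array([30, 28, 32, 31, 29, 27, 33, 30, 28, 31], dtype=float)
resp_size = np.array([20, 18, 25, 22, 19, 17, 26, 21, 18, 23], dtype=float)
predictors = {"format": fmt, "class": class_size, "response": resp_size}
Y = 3.8 + 0.4 * fmt + np.random.default_rng(0).normal(0.0, 0.05, 10)
n = len(Y)

results = []
for r in range(len(predictors) + 1):
    for subset in itertools.combinations(predictors, r):
        X = np.column_stack([np.ones(n)] + [predictors[name] for name in subset])
        beta = np.linalg.lstsq(X, Y, rcond=None)[0]
        rss = float(np.sum((Y - X @ beta) ** 2))
        k = X.shape[1]                      # number of fitted coefficients
        aic = n * np.log(rss / n) + 2 * k   # constants of the Gaussian
        bic = n * np.log(rss / n) + k * np.log(n)   # likelihood dropped
        results.append((subset, aic, bic))

best_aic = min(results, key=lambda item: item[1])
best_bic = min(results, key=lambda item: item[2])
print("AIC selects:", best_aic[0], "| BIC selects:", best_bic[0])
```

With three candidate predictors there are 2^3 = 8 models to score; BIC's heavier penalty of k·ln(n) favours smaller models than AIC's 2k when n ≥ 8.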

For Questions 1, 2, 3, and 5, the AIC and BIC scores are in agreement: the models with the lowest scores are selected. For Question 4, the models selected by AIC and BIC differ. We note that these two models share similar values of AIC (i.e. -1.22 and -1.19, a difference of 0.03), but the BIC values continue to decrease significantly from one model to the next (from -0.01 to -0.28, a difference of 0.27). As such, we select the model with the lowest BIC score, i.e. the model with only the teaching format as the predictor. 

From the selected models, we conclude that:

  • For Questions 1, 2 and 4, the selected model contains only the teaching format as the predictor variable. Therefore, only the change in the format of teaching significantly affects the feedback scores. Since the corresponding predictor XFormat is categorical in nature (i.e. either 0 or 1 in value), these selected models are equivalent to the ANOVA analysis. 
  • For Questions 3 and 5, the selected model contains the predictors of teaching format and response size. Therefore, both are significant factors that cause the change in the feedback scores.
  • Class size is not included in the selected model of any of the questions, indicating that the effect of class size is not significant.

From Table 4, the β̂1 coefficients are positive for all questions, indicating that there is evidence, at the 5% level or better, that adoption of the authentic learning format of teaching causes an increase in these feedback scores. Therefore, there is indeed a beneficial effect on students' perception of their own learning with regard to all the areas covered by the five questions. In the framework of effect size d as adopted by Hattie (2009), the hinge point of d = 0.4 is exceeded for every question.
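One common form of the standardised effect size d referred to here is Cohen's d with a pooled standard deviation; a sketch with hypothetical before/after scores (not the published data) shows the computation:

```python
import math

# Hypothetical before/after feedback scores (NOT the published data).
before = [3.8, 3.9, 3.7, 4.0, 3.8]
after = [4.2, 4.3, 4.1, 4.4, 4.2]

def mean(xs):
    return sum(xs) / len(xs)

def sample_var(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

# Cohen's d: difference of means divided by the pooled standard deviation.
n1, n2 = len(before), len(after)
sp = math.sqrt(((n1 - 1) * sample_var(before) + (n2 - 1) * sample_var(after)) / (n1 + n2 - 2))
d = (mean(after) - mean(before)) / sp
print(f"d = {d:.2f}")   # compared against Hattie's hinge point of d = 0.4
```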

Table 4 
Regression coefficients and their p-values for the respective selected models of each question. The selected models for Questions 1, 2 and 4 contain only the teaching format as the predictor, while the selected models of Questions 3 and 5 contain the predictors of teaching format and response size only. The degrees of significance are indicated as * p-value < 0.05, ** p-value < 0.01 and *** p-value < 0.001.


While all the coefficients are statistically significant (p-value < 0.05), the degree of significance of β̂1 in Questions 2 and 3 is particularly high (p-value < 0.001). For Question 2, this may be attributed to the ungraded pre-submission meetings, and the on-site provision of feedback and suggestions for improvement to students during the poster session and the oral presentations. Such two-way communication may have resulted in the perception that the concerns and doubts of students were received and addressed by the lecturer, and that the students received the lecturer's feedback on their performance. For Question 3, the design of the learning tasks and their implementation in the form of group work directly promote independent and self-directed learning, which are both necessary research skills. The strong significance of β̂1 for Question 5 on the overall effectiveness score also indicates that the authentic learning framework indeed improves overall perception of the course and the lecturer. 

In contrast, the β̂3 coefficients for Questions 3 and 5 are significant but negative, indicating that the scores for these questions are adversely affected by the number of respondents in the student feedback exercise. This finding aligns with the well-established phenomenon where feedback scores are generally lower when there are more respondents in the feedback exercise. 

Future Work

One possible extension of this work is to investigate whether there is evidence of an improvement in students’ academic performance over the 10-year period due to the implementation of the authentic learning framework. While the implementation is designed to deliver relevant research skills, it could potentially have improved students’ academic performance as well.  

Further augmentation of the authentic learning framework may also be achieved by leveraging the existing group work component to promote peer learning of the main course content. Recently, in AY2019/20, I began to adopt blended learning in the form of a flipped classroom for this course, where face-to-face sessions are designed to allow for peer interaction and discussion among students as they work towards the common goal of solving problems. This is again an authentic process among researchers working in a large research group or laboratory. Integrating the flipped classroom with authentic learning may also have a cyclical reinforcement effect between the peer learning in class and the group work outside of class. However, the implementation is still too recent to gauge its effectiveness.


Conclusion

An authentic learning framework aimed at delivering transferable skills was implemented for a postgraduate physics course, based on guidelines provided by Herrington and Herrington (2006). A context authentic to a career in physics research was defined, and a learning environment was simulated in the classroom based on the set of transferable skills intended to be delivered in this course. Authentic learning tasks of literature review, poster creation, and oral presentation, carried out as group work, were designed and implemented over a planned timeline during a single semester. Instructional scaffolding was provided in the form of pre-submission group meetings, and rubrics were designed to deliver authentic assessment.

Linear regression analysis was conducted on end-of-term student feedback scores of five different survey questions over 10 academic years. The predictor variables considered are the teaching format, class size and the number of responses in each feedback exercise. Parsimonious models were then selected by considering the Akaike Information and the Bayesian Information Criteria. 

The results at the 5% level indicate that the teaching format has a significant, beneficial impact on all the survey questions, with the teaching format being the only significant factor for the questions regarding the enhancement of thinking ability, the provision of timely feedback, and the increase of interest in the subject matter. The degree of significance of the teaching format predictor is particularly high for the questions on the provision of research skills and timely feedback, suggesting that the implementation of authentic learning has had a significant impact on students' perception in these areas.


References

Adams, W. K., Perkins, K. K., Podolefsky, N. S., Dubson, M., Finkelstein, N. D., & Wieman, C. E. (2006). New instrument for measuring student beliefs about physics and learning physics: The Colorado Learning Attitudes about Science Survey. Physical Review Special Topics: Physics Education Research, 2(1), 010101.

Botma, Y., Van Rensburg, G. H., Coetzee, I. M., & Heyns, T. (2015). A conceptual framework for educational design at modular level to promote transfer of learning. Innovations in Education and Teaching International, 52(5), 499–509. 

Bromley, A. P., Boran, J. R., & Myddelton, W. A. (2007). Investigating the baseline skills of research students using a competency-based self-assessment method. Active Learning in Higher Education, 8(2), 117–137.

Cargill, M. (2004). Transferable skills within research degrees: A collaborative genre-based approach to developing publication skills and its implications for research education. Teaching in Higher Education, 9(1), 83–98. 

Dykstra, D. I., Boyle, C. F., & Monarch, I. A. (1992). Studying conceptual change in learning physics. Science Education, 76(6), 615–652.

Gilbert, R., Balatti, J., Turner, P., & Whitehouse, H. (2004). The generic skills debate in research higher degrees. Higher Education Research & Development, 23(3), 375–388.

Gulikers, J. T. M., Bastiaens, T. J., & Martens, R. L. (2005). The surplus value of an authentic learning environment. Computers in Human Behavior, 21(3), 509–521.

Hattie, J. A. C. (2009). Visible learning: A synthesis of over 800 meta-analyses relating to achievement. New York: Routledge.

Herrington, A., & Herrington, J. (2006). Authentic learning environments in higher education. Pennsylvania: Information Science Publishing.

Herrington, J., & Oliver, R. (2000). An instructional design framework for authentic learning environments. Educational Technology Research and Development, 48(3), 23–48.

Leak, A. E., Rothwell, S. L., Olivera, J., Zwickl, B., Vosburg, J., & Martin, K. N. (2017). Examining problem solving in physics-intensive Ph.D. research. Physical Review Physics Education Research, 13(2), 020101.

O’Byrne, J., Mendez, A., Sharma, M., Kirkup, L., & Scott, D. (2008, November 30–December 5). Physics graduates in the workforce: Does physics education help? [Conference presentation]. Australian Institute of Physics (AIP) 18th National Congress, Adelaide, Australia.

Ornellas, A., Falkner, K., & Stålbrandt, E. E. (2019). Enhancing graduates’ employability skills through authentic learning approaches. Higher Education, Skills and Work-Based Learning, 9(1), 107–120. 

Parker, R. (2012). Skill development in graduate education. Molecular Cell, 46(4), 377–381.

Scherr, R. E. (2008). Gesture analysis for physics education researchers. Physical Review Special Topics: Physics Education Research, 4(1), 010101.

About the Author

CHAN Taw Kuei is a senior lecturer at the NUS Department of Physics. He began teaching at the Department in 2010 and joined as a lecturer in 2013. His research interests include materials characterisation using ion beam analysis methods, authentic and experiential learning, teacher immediacy, as well as theories of motivation in education psychology.