
Australasian Journal of Educational Technology
2004, 20(3), 275-294

A spreadsheet based simulator for experiential
learning in production management
Chuda Basnet and John L. Scott
University of Waikato

This paper presents a spreadsheet based simulation game for teaching and
learning production management concepts of forecasting, material
requirements planning, order review and release. In this game the student
plays the role of a production planner managing two products, for which
customer orders are placed in variable quantities throughout the week.
The student builds forecasting and material requirement planning systems
to help in the tasks of production and vendor order release. In parallel
with this, we have run a small learning awareness program, to test and
stimulate the skill of reflection. Initial student responses to the game have
been favourable, but the proportion of time spent on reflection is low.
Contemplated refinements are presented.

Introduction
Simulation games are activities designed to mimic the reality of the
external world, within the classroom, with the goal of instruction. The
learning is intended to be experiential - the student experiences the
studied phenomenon and learning proceeds inductively. Besides
simulation games, there are other means of providing the experience of
reality to students - case study, role playing, in-basket method, and
incident process. The main advantage of simulation games over these
alternatives is the dynamic nature of the games - the incorporation of the
time element, imitating the passage of time. Students have to live with
the results of their past decisions - the effects of these decisions persist
into the future in the game. Another advantage is the verisimilitude
offered - some games are able to provide a high level of make believe and
fantasising. The strong interest that is aroused in the subject matter is
itself of pedagogical value. A simulation game can be restarted with a
new strategy for playing the game, but a case study can only be used
once (Gilgeous and D'Cruz, 1996).

While there are many top management games available for students and
teachers of management, the number of games aiming to teach/learn
specific management skills is small. There is also a dearth of games that
enhance detailed modelling and decision making capability. The primary
goal of the research presented in this paper is to develop a simulation
game that fills this void, in the specific area of management of
production planning and control, and to explore the learning
implications of the game. The paper describes the game that we
developed. This game is able to impart an understanding of the issues in
order-release; and as a secondary benefit, the spreadsheeting skills of the
students are enhanced. The paper also describes the learning experiences
of the students, with an emphasis on reflection, and their evaluation of
the simulation game.

Literature review
The first simulation game for teaching business management appears to
have been introduced in 1955. This game, called Monopologs, was
developed by the Rand Corporation for teaching logistics to U.S. Air
Force personnel (Faria, 1990). In 1956, the American Management
Association introduced its Top Management Business Game, which was
meant for training top management, and included decisions on
production, marketing, assets, inventory, etc. The computations were
performed on an IBM 650 computer (Kibbee, Craft & Nanus, 1961). In
this game, the players filled a form indicating their decisions, this
information was punched into cards, and the computer program was
run. The computer provided performance reports, and the cycle was
repeated. By 1961, Kibbee et al. (1961) listed 31 computerised business
games, five of which were production simulators. Since then there has
been a steady increase in the number, sophistication, and adoption of
simulation games (Faria, 1987; McKenna, 1991; Burgess, 1991; Wolfe,
1993).

There is now a wide range in the complexity of simulation games: from
board games to computerised simulations. Even the simple act of
walking may serve as a simulation for instructional purposes (Wu, 1989)!
While a simulation mimics reality and is often used to predict what
would happen in a given scenario, the word “game” suggests playfulness
and competition. Simulation games combine these two characteristics.
Games that explore business strategies for the entire organisation are
called top management games, and games that have their primary focus
on a selected functional area of business are called functional games.
These functional games are available in the areas of accounting/finance,
marketing, production, and human resource management.

The model of experiential learning provides a theoretical underpinning
of simulation games as a learning/teaching tool. Kolb's (1984)
experiential learning model is shown in Figure 1. According to this
model, concrete experience of a phenomenon in the real world triggers
the learning cycle. This event is observed/experienced, and
causes/encourages reflection in the student. The student forms/uses
abstract concepts and models/hypotheses to make sense of reality. This
leads to experimentation and hypothesis testing that provides concrete
experience, which starts the cycle again. Simulation games provide the
concrete experience needed in Kolb's model, and are a good platform for
stimulating learning awareness in students, encouraging them to better
understand their own learning processes (Scott, 2002).

[Figure 1 depicts the cycle: Experience → Reflection → Hypotheses → Test of Hypotheses → back to Experience.]

Figure 1: Experiential learning (EL) cycle model

Raia (1966), in his often cited study, carried out an experimental
comparison between a simple game, a complex game, and readings.
Boseman and Schellenberger (1974) used modified versions of Raia's instruments to
compare students who did only cases against students who did cases and
games. Attempts were made to equalise the workload. They did not find
the interest of game players any higher than non-game players. Their
attitudes towards cases, management, course, and the instructors were
not significantly different. Similar results were obtained for perceived or
actual learning.

Wolfe and Guth (1975) made an experimental comparison between case
only and game only approaches to teaching business policy. In their
study, the game only students achieved a higher level of examination
scores than the case only students. Students in the game only section
achieved a higher degree of principle and concept mastery, but the
differences in fact mastery were not significant. However, their games
only section had a lot of structure - periodic reporting, class discussions
of events, review sessions, self appraisals, and diary of events. This
structure and guidance essentially closed the loop of experiential
learning (Figure 1) and may have contributed to the positive result as
evidenced by a subsequent study (Wolfe, 1975), which did not have this
guidance, and had a negative result.

Parsuraman (1981) classifies the existing evaluation of simulation games
into three methodologies:

1. Experimental evaluation, where the students are split into different
groups which are exposed to different educational techniques. The
efficacy of the techniques is then judged by a common examination at
the end. Most of this type of evaluation has been inconclusive, and
Parsuraman criticised it on the grounds that the common
examinations test cognitive learning of the students, while simulation
games actually teach the "process" of decision making (and not
cognitive learning).

2. Correlation between students' performances in the simulation games
and their performances in other examinations and assignments in the
course. Here, Parsuraman questions the simulation game performance
as a measure of learning.

3. Surveys of student self reports. Generally, these surveys have shown
simulation games in a positive light, but the ability of the students to
judge the worth of games is questionable. Parsuraman concludes that
the evaluations should actually test the appearance of reality in the
games, and the worth of the types of decisions made in the games. He
suggests that experienced practitioners, not college students, are
better judges.

Ruohomäki (1995) discusses the use of simulation games from the
viewpoint of learning theory. A simulation game combines the features
of games (competition, cooperation, rules, participants, roles) and
simulation (abstraction of reality by a model). Simulation games are used
when there are no possibilities for students to obtain experience of the
systems or situations in the real life - where reality is too expensive,
complex, dangerous, fast, or slow. Ruohomäki identifies two purposes
for simulation games: 1) understanding of reality - to describe, analyse,
and evaluate realities, and 2) training - learn procedures, and carry them
to work activities. To achieve these, a simulation game should include
orientation to the game prior to the game, the game, a debriefing
consisting of reflections and observations, and forming concepts and
generalisations, and integration. Simulation games can provide an
opportunity for the active experimentation phase of the experiential
learning cycle. Participants can try out new solutions, and see the
probable consequences.

According to Ruohomäki, simulation games provide: 1) cognitive
learning outcomes - information, principles, critical thinking, 2) attitude
changes toward the subject matter, society, and oneself, 3) increased
motivation and interest towards the subject, for doing research in that
field, and 4) positive effects on groups - better communication,
interactional skills, empathy for those in other roles. Simulation games
provide active learning (versus passive learning in lectures). Thus they
provide student centred learning, and emphasise learning by doing.

Finally, Lane (1995) in a review of simulation games has crystallised
some of the cautions to be exercised in the educational use of games:

1. Learning objectives. It is easy to be sold by the gimmicks and fun in the
games, but what will be learnt?
2. Supporting materials. There should be enough learning materials to
support the game's objectives.
3. Other pedagogical tools. It cannot be expected that simulation games
will serve all the needs of a course. At best they will supplement a
well-designed course.
4. Bells and whistles. So much seductive technology is available that it is
possible for the designers to be looking for appropriate topics for the
technology, rather than finding suitable technology for the teaching
objectives in hand. The use of animation, multimedia, and virtual
reality may provide more fun than education.
5. Complexity. It is better to have a simple game serving a specific
learning objective, than a complex game satisfying a number of
objectives. Students may find it hard to capture the desired
experience.
6. Briefing and debriefing. Evaluation of simulation games has shown time
and again the importance of proper briefing and debriefing. Students
cannot be left to decide on the rationale of the game, and to reflect on
what happened.
7. Facilitators. The facilitators need to understand the simulations and
learning objectives thoroughly. Games cannot teach by themselves.
8. Resources. Computer software projects are notorious for underestimating
resource requirements. Game designers need to beware.

To conclude this literature review, in spite of many anecdotal success
stories and apparent student enthusiasm with games, the objective,
experimental evaluation of game based instruction, particularly as
compared with case based instruction, remains inconclusive. Obviously,
it is more difficult to control for the quality of games as compared to the
quality of cases, for the instructional style and structure, and for the
enthusiasm on the part of the students and the instructor. Post-game
counselling and review appears to make a definite positive impact on the
effectiveness of gaming. The literature generally supports the notion that
students usually find gaming enjoyable in spite of the considerable time
taken up by it, and a well-conducted simulation game is at least not
worse than a case study in providing experiential learning. Therefore
judicious use of both cases and simulation games in the classroom, taking
heed of the experience of others, should bring out the benefit of both
techniques in the development of managerial skills.

Production management games


There are many production management games available for educational
purposes, but their number has not kept pace with the growth in top
management games and marketing games. The
available games, such as Joblot (Churchill, 1970), PROSIM (Mize et al.,
1971), DECIDE-P/OM (Biggs, 1987) provide an understanding of
production systems more at the strategic level than at the tactical level, as
discrete event dynamic systems. There is a dearth of games designed to
teach specific technical skills in production management. Lane (1995), in
the pedagogical review of simulation games referred to earlier, favoured
a simple game serving a specific learning objective over a complex game
satisfying a number of objectives. This paper
presents such a game, with the objective of learning about order release
in production management. A feature of this game is that students build
their own decision support system (DSS) to play it. Building a DSS
provides the game players with a detailed modelling and decision
making capability (Yeo and Nah, 1992). This DSS is based on a specific
model, material requirements planning (MRP). Teaching MRP is a goal of
the game.

Beginning students of production planning and control (PPC) often
struggle with the technical concepts in PPC such as bill of materials,
order review and release, and action buckets, to name just a few. The
students need to see how forecasting leads to the master production
schedule, then to material requirements planning, and finally to order
release. An important objective for the students is to appreciate the
variability and dynamics of the production environment, where for
example even the forecast is not a static input to production planning. A
simulation game is ideal for this appreciation. Such a simulation game
would be a more specific and simpler game, compared to the larger
production games mentioned above, which attempt to give a flavour of
the entire production management function and even its strategic
significance.

In this paper we present a spreadsheet based production planning
simulator called MRP-SIM, designed to meet the above objective. The
next section presents a description of the game. This is followed by the
learning awareness program we ran in parallel and a student evaluation
of the game. Finally, concluding remarks are presented.

The game
Objectives
The game is designed to enhance the understanding of PPC concepts
such as bill of materials, routing, order review and release, priority
setting, queuing, forecasting, master production scheduling (MPS),
material requirements planning (MRP), and capacity requirements
planning (CRP). These concepts are usually treated in isolation as
discrete concepts. The simulation brings out their interactions in a simple,
yet realistic setting.

[Figure 2 shows the two-level product structures: Alpha is assembled from Comma (2), Delta (1) and Epsilon (1); Beta is assembled from Gamma (1), Delta (1) and Epsilon (2); Delta in turn requires Fi (1), and Epsilon requires Fi (2).]

Figure 2: Bill of materials

The scenario
In this game the students play the role of the production planner of a
manufacturing company. They manage two products, Alpha and Beta,
for which customer orders are placed on the company in variable
quantities throughout the week. These products are made up of
components (Comma, Delta, Epsilon, Fi, and Gamma), some of which are
produced within the company, and others are sourced from vendors. In
carrying out the production, the parts are routed through processing
machines (Kappa, Mu, Pi, Rho, and Sigma) within the company, where
processing time is spent, and queues are built up. The bill of materials is
shown in Figure 2, and the routing and processing time information is
provided in Figure 3.

Alpha: Kappa (1.2 min), then Rho (0.6 min)
Beta: Kappa (1.8 min), then Rho (0.48 min)
Comma: purchased material, vendor lead time = 1220 min
Delta: Mu (0.6 min), then Pi (0.3 min), then Sigma (0.5 min)
Epsilon: Sigma (0.8 min), then Mu (0.5 min), then Pi (0.6 min)
Fi: purchased material, vendor lead time = 1440 min
Gamma: purchased material, vendor lead time = 2000 min

Figure 3: Routing and processing time information
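
Because the DSS that students build is essentially an MRP explosion of these data, a small sketch may make the logic concrete. The Python fragment below is an illustration only (the actual MRP-SIM simulator is written in Visual Basic inside Excel, and students build their DSS with spreadsheet formulas): it encodes the product structures recoverable from Figure 2 and explodes planned product quantities into gross component requirements; the example order quantities are invented.

```python
# Illustrative sketch, not the authors' implementation: the bill of
# materials from Figure 2, and a recursive explosion of planned product
# quantities into gross component requirements.

BOM = {
    "Alpha":   {"Comma": 2, "Delta": 1, "Epsilon": 1},
    "Beta":    {"Gamma": 1, "Delta": 1, "Epsilon": 2},
    "Delta":   {"Fi": 1},
    "Epsilon": {"Fi": 2},
}

def explode(item, quantity, requirements):
    """Accumulate gross requirements for an item and all its components."""
    requirements[item] = requirements.get(item, 0) + quantity
    for child, qty_per in BOM.get(item, {}).items():
        explode(child, quantity * qty_per, requirements)

# Hypothetical planned production of 100 Alpha and 50 Beta
reqs = {}
for product, qty in (("Alpha", 100), ("Beta", 50)):
    explode(product, qty, reqs)
print(reqs)  # Comma: 200, Delta: 150, Epsilon: 200, Fi: 550, Gamma: 50
```

Netting such gross requirements against on hand inventory and open orders, and offsetting them by the lead times in Figure 3, is what yields the planned order releases in the students' spreadsheets.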


The production planning is done on a weekly basis. At the beginning of
the week, inventories are checked and orders are released both within the
company and to the vendors. Through the week the processing takes
place. Customer orders arrive based on a stochastic process that
simulates seasonality, trend, and randomness. The parameters of this
process are of course unknown to the students. As customer orders
arrive, the orders are filled from inventory on hand. Customer orders
may be filled partially if there is insufficient on hand inventory for the
whole order. Unfilled customer orders are placed on file and filled when
the product is available. The weekly cycle of activities is:
• release production and vendor orders,
• decide on overtime for the processes,
• simulate for a week (during this time these events occur: production
in the facility, order fulfilment from the vendor, order arrival from the
customer, and order fulfilment to the customer), and
• monitor the situation.
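
The demand process itself is hidden from the players, but a sketch of one plausible generator shows what "seasonality, trend, and randomness" can mean in practice. All parameters below are invented for illustration; they are not the values used in MRP-SIM.

```python
import math
import random

def weekly_demand(week, base=900.0, trend=12.0, season_amp=0.25,
                  season_len=13, noise_sd=80.0):
    """One possible seasonal/trend/noise demand generator (illustrative
    only; MRP-SIM's actual parameters are hidden from the players)."""
    seasonal = 1.0 + season_amp * math.sin(2 * math.pi * week / season_len)
    mean = (base + trend * week) * seasonal
    return max(0, round(random.gauss(mean, noise_sd)))

# The 20 weeks of demand history available when the student takes over
history = [weekly_demand(w) for w in range(1, 21)]
```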

The planner participates in this process beginning from week 20 and the
game ends after week 32. A simulator written in Visual Basic and
incorporated in an Excel spreadsheet simulates this scenario. The
students interact with the simulator in the spreadsheet environment.

Profits are accumulated for every item in the customer order that is filled.
For every item in the customer order that is late (filled after the day the
order arrives), a penalty is charged per day. There are also costs
associated with holding inventory (both finished and work in process)
and with overtime work.
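
Pulling those elements together, the weekly financial score can be thought of as revenue on filled items minus lateness, holding, and overtime costs. The cost rates in the sketch below are hypothetical placeholders; the paper does not publish the actual figures used in the simulator.

```python
def weekly_result(units_filled, late_item_days, avg_inventory_units,
                  overtime_minutes,
                  unit_profit=5.0, late_penalty_per_day=0.5,
                  holding_cost_per_unit=0.05, overtime_cost_per_min=0.8):
    """Sketch of the weekly score: profit on each filled item, a per-day
    penalty on each late item, inventory holding cost (finished plus work
    in process), and overtime cost. All rates are assumed values."""
    revenue = unit_profit * units_filled
    penalties = late_penalty_per_day * late_item_days
    holding = holding_cost_per_unit * avg_inventory_units
    overtime = overtime_cost_per_min * overtime_minutes
    return revenue - penalties - holding - overtime
```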

Student task
To play the game, the students only need to make decisions on order
release and overtime on a weekly basis. The objective of the students is to
maximise their total profits at the end of week 32. After playing the game
for a while, students find out that ordering on an ad-hoc basis leads them
to financial ruin! They are asked to use past data for developing a
forecast, which, through the material requirements planning process,
should help them in deciding how many parts / products to order and
when. A capacity requirements planning module can help them decide
how much overtime to order. Their specific assignment is to create a
decision support system (DSS) for order release using these concepts.
They then use this DSS to play the game and see for themselves how
MRP works to facilitate accurate order release, in synchronisation with
the forecast demand (and to increase their financial performance!). They
create the DSS within the spreadsheet environment of MRP-SIM.
Students don't need to do programming in Visual Basic, but they need to
be proficient in using spreadsheet software. Learning to use spreadsheet
software is an additional goal of this assignment.
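
The paper does not prescribe a particular forecasting model, so that choice is left to the students. As one example of what could sit at the front of the DSS, the sketch below applies trend-adjusted (Holt's) exponential smoothing to the weekly demand history; the smoothing constants are arbitrary illustrative values.

```python
def holt_forecast(history, alpha=0.3, beta=0.1, horizon=1):
    """Trend-adjusted (Holt's) exponential smoothing - just one of several
    models a student DSS could use to forecast weekly product demand."""
    level, trend = float(history[0]), 0.0
    for demand in history[1:]:
        previous_level = level
        level = alpha * demand + (1 - alpha) * (level + trend)
        trend = beta * (level - previous_level) + (1 - beta) * trend
    return level + horizon * trend

# e.g. next week's Alpha demand from the 20 weeks of history:
# alpha_forecast = holt_forecast(alpha_history, horizon=1)
```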

User interface
The main screen of MRP-SIM is shown in Figure 4. The upper left corner
shows the products currently being processed by the machines and their
queues (for example, the process Mu is currently processing 3000 units of
Epsilon). It also shows orders placed with the vendor. The lower part of
the screen shows the current inventory position. Current pending
customer orders are also shown. To start the game, the Initialise Game
button is pressed, which causes a history of demand to be created up
until week 20. The students can view the current inventory, work in
progress, and vendor orders at this time. Next they need to decide on the
orders to place for next week, their priorities, and overtime to authorise
for the next week. Once they have made the decisions, they communicate
it to the simulator by clicking on the Make / Alter Decisions button.

Figure 4: The main screen

Once the decisions are entered, they press the Simulate! button to let the
production for the week begin, and to let the time advance to the next
week. The queues of the machines, the finishing of the work, the
inventory position, the arrival of customer orders, and the filling of the
customer orders are animated on the screen. Profits are accumulated for
every item in customer orders that is filled. For every late item a penalty
is charged per day. There are also costs associated with holding
inventory and with overtime work. The details of the model may be
viewed by pressing the View Model Details button. At the end of the week
the current and cumulative financial performance is shown at the top
right of the screen (Figure 4 shows that a net profit of $2062.25 was made
at the end of week 21). The students then make decisions for the next
week and repeat the cycle.

The assignment
Students are asked to play the game in an ad-hoc manner, without any
decision support, to familiarise them with the simulator and to see how
well they can perform without the forecasting, MRP, and CRP models.
They are then asked to build a DSS consisting of these models to help
them play the game. In building the DSS they can create initial forecasts
from the 20 weeks of historical data. The forecasts and the MRP need to
be updated as new data becomes available. All this is done within the
spreadsheet environment. The students are asked to hand in their DSS (on
a diskette) and a semi-structured reflective essay on the game, their
experiences, and their understanding of the concepts.

The assignment consisted of three steps:

Step 1. Familiarisation with the simulator. The students play the game
in an ad hoc manner, guessing the decisions.

Step 2. Playing the game on a re-order point basis. The students try out
different levels of re-order points and fixed order quantities.

Step 3. Playing the game with a DSS, built by the students. To do this,
they use a forecasting model to forecast demand for the finished
products. This is fed into the master production schedule (MPS),
which is then exploded through MRP into requirements for the
components. The students then create the CRP model from the
MRP model. This completes the DSS, which suggests the order
quantities for all the items, and the overtime to order for all the
processes (a sketch of this last step is given below).
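
As a rough illustration of the final link in that chain (again, not the authors' spreadsheet model), the sketch below turns planned weekly production quantities into machine load using the routings of Figure 3 and suggests overtime wherever the load exceeds regular capacity. The weekly regular capacity figure and the order quantities are assumptions.

```python
# Routings and unit processing times (minutes) from Figure 3
ROUTING = {
    "Alpha":   [("Kappa", 1.2), ("Rho", 0.6)],
    "Beta":    [("Kappa", 1.8), ("Rho", 0.48)],
    "Delta":   [("Mu", 0.6), ("Pi", 0.3), ("Sigma", 0.5)],
    "Epsilon": [("Sigma", 0.8), ("Mu", 0.5), ("Pi", 0.6)],
}

def capacity_plan(planned_orders, regular_capacity_min=2400.0):
    """Sketch of a CRP step: accumulate machine load from planned order
    quantities, then suggest overtime for any load above weekly regular
    capacity (2400 min per machine is an assumed figure)."""
    load = {}
    for item, qty in planned_orders.items():
        for machine, minutes_per_unit in ROUTING.get(item, []):
            load[machine] = load.get(machine, 0.0) + qty * minutes_per_unit
    overtime = {machine: max(0.0, minutes - regular_capacity_min)
                for machine, minutes in load.items()}
    return load, overtime

# Hypothetical planned orders for the coming week
load, overtime = capacity_plan({"Alpha": 800, "Beta": 400,
                                "Delta": 1200, "Epsilon": 1600})
```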

Formal assessment for this assignment consists of a reflective essay (50% of
the assignment marks) and the DSS (50%). The class was split into self
selected groups of 3 students. The development of the DSS was a group
activity, but the reflective essay was an individual task. The assessment
criteria were reflection breadth (number of activities or items discussed)
and reflection depth (thoroughness of discussion, depth being more
important than breadth), and completeness and correctness of the DSS
work.

Learning and the simulation


The EL cycle (Figure 1) underpins many processes (Scott, 1990), including
simulation. In the cycle, reflection is the stage that connects a simulation
experience to the models and concepts used to improve performance.
While our students have some reflection skills, encouraged by some
familiarity in our first year course with simple concepts such as the
Plus, Minus, Interesting (PMI) technique of Edward de Bono, and Single-loop
(SLL) versus Double-loop Learning (DLL) (Figure 5) of Chris Argyris
(Argyris, 1977), reflection still appears to be the weakest link in their EL
cycle, for which no formal instruction is given. Also, simulation models
can be seen as devices to support reflection before action. We therefore
chose reflection as a parallel, learning focus for this study.

[Figure 5 contrasts the two loops. Single loop: a desire/expectation leads to an action and a result/consequence, which is compared with the expectation. Double loop: the same cycle, but the comparison also asks whether the expectation is appropriate, whether the action is the best available, and whether the assumptions are reasonable.]

Figure 5: Single-loop versus double-loop learning

We decided to:
• track the main activities students used before and during each major
step within the MRP-SIM assignment, using activity logs
• create time maps from the activity logs
• record the levels of reflection witnessed
• stimulate learning awareness by discussing, in class, the results with
the students, before they finalised their reflective essays, which were
part of the assessment scheme.

Tracking the main activities was done using a semi-structured, one page
questionnaire we called an activity log (example in Appendix 1) – one for
each major step in the assignment (see above). Each log asked students to
record the time spent on, and the thinking behind, each significant task their
group had undertaken.

Creating time maps of the steps was simply done by graphing tasks
versus (proportion of) time spent, over the engagement. One class graph
for each significant MRP-SIM engagement was created, using different
colours for each student group.
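
For readers who wish to reproduce this, a time map is simply a per-group plot of the proportion of reported time against the logged tasks. The sketch below shows one way such a class graph could be drawn; the task names and minutes are invented.

```python
# Minimal sketch of a "time map": proportion of reported time per task,
# one colour per student group. Task names and minutes are invented.
import matplotlib.pyplot as plt

tasks = ["Read assignment", "Explored spreadsheet", "Built models",
         "Ran simulation", "Compared results", "Questioned approach"]
group_minutes = {
    "Group 1": [20, 60, 150, 90, 30, 10],
    "Group 2": [30, 90, 120, 60, 20, 5],
}

for group, minutes in group_minutes.items():
    total = sum(minutes)
    plt.plot(tasks, [m / total for m in minutes], marker="o", label=group)

plt.ylabel("Proportion of time spent")
plt.xticks(rotation=45, ha="right")
plt.legend()
plt.tight_layout()
plt.show()
```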

With the MRP-SIM assignment requiring students to set levels of
production, observe the result/outcome, and revise their estimates,
potential levels of reflection were clearly related to Single-loop and
Double-loop Learning. Four levels were set: (0) trial and error, or no
plan, (1) discussed/predicted what would happen before numbers were
entered (SLL), (2) questioned the model or process being used to predict
the result/outcome (DLL), and (3) questioned the process being used in
step (2).

Experiences in using the game in a teaching context


This game and assignment combination was used in an operations
management course offered at the final year of a Bachelor of
Management Studies degree at the University of Waikato in Hamilton,
New Zealand. The assignment was a part of the module covering
production planning and control. The students were given the
assignment after covering the concepts of MRP in the class. It is
emphasised that we did not conduct a formal annual evaluation of the
game since we had no control over the sample size and little control over
when the course would be offered. Instead we assigned the game to the
students as opportunities arose to do so and we collected the students’
and our own experiences after the event for reflection and potential
modifications. At the end of a 1998 course, 10 of the 18 students taking
the course evaluated the game anonymously. After a 2002 course, 5 of the
6 students did a similar evaluation. There were differences between these
two offerings, as we tried to improve after the first instance of using the
game. One major difference was that the activity logs were only
instituted in 2002. In the following sections we offer the insights we
gained.

The assignment was distributed in class and the first lab session was held
14 days later. As “entry tickets”, the activity logs for the ad hoc and
re-order point runs of the simulation were collected. The second lab
session was held two days later. The two lab sessions were essentially help
sessions, offering individual help to the class members in creating their
DSS. Students can often be unsure of the concepts emphasised in this
assignment, or even the purpose of the DSS. Many students struggle to
acquire the level of spreadsheeting skills needed for the assignment. The
two lab sessions offered instructions in these matters.

The spreadsheet files and the DSS activity logs were then submitted for
assessment. Discussion of the results and feedback was given in class, 21
days before their reflective essay on the assignment was submitted. The
reflective essay called for three pages of thoughtful responses to reflective
questions we had provided.

There were six students taking this paper in 2002; they were given three
activity logs to fill in, corresponding to three ways, or steps, of playing
the game: in an ad hoc way, with re-order points for each item, and with
the help of the DSS they created. Students filled in the time they spent
on various activities. All possible activities were identified in the logs,
including the four dealing with reflection, but the students were not told
the learning levels of the logged items until the discussion of results and
feedback session. This session was deliberately held before the final
reflective essays were submitted.

Figure 6 shows the self-reported total time spent by students on the four
levels of learning in carrying out the assignment. Similar graphs were
also available for individual steps.

[Figure 6 is a chart titled "Activity Log of Students", plotting time spent (minutes, 0-300) against learning level (0-3), with one series for each of the six students.]

Figure 6: Self reported activity log for all steps

When presented with the learning levels of their activities and the time
they spent on activities, most of the students were surprised at how much
time they spent on level 0 activities (trial and error or without a
plan), as against single loop or double loop learning.

Student evaluations
The students were asked to assess the simulation game in the context of
their normal end of course anonymous course and teacher assessment.
Their average ratings on a scale of 1 (strongly agree) to 5 (strongly
disagree) are presented below in Table 1.

Table 1: Student evaluation of the game
(average agreement rating, 1998 / 2002)

Simulation gaming was instructive for learning production planning: 2.0 / 1.2
I enjoyed playing the simulation game: 1.9 / 1.6
I put a lot of effort in playing the simulation game: 2.1 / 1.4
The simulation game represents fairly well the decision making faced by real production planners: 2.1 / 1.8
I found the simulation game challenging: 1.8 / 1.4
There was a strong sense of make believe in playing the game: 2.2 / 2.0
I felt the game enhanced my understanding of planning and control: † / 1.2
The game helped me improve my use of spreadsheet software: † / 1.4
The game assignment should be retained for next year: 1.8 / 1.0
† Not included in the 1998 evaluation

Generally, the students found the assignment quite challenging. But they
felt that they learnt the MRP concepts pretty well. Since the students had
some familiarity with spreadsheets, the spreadsheet format of the game
helped in gaining student acceptance. Their evaluations bore testimony
to this. Some student comments, collected in 2002 from their reflective
essays, are given below.
Student A:
The next consideration was that the game opened my perspective to the
complexity of working within a production environment. It was insightful
to see all of the considerations that need to been (sic) thought of such as
overtime, holding costs and late delivery costs. Also the fact that at the
end of the game it was not full proof (sic, the student meant ‘fool-proof’)
in its recommendations supported the difficult nature of production
management.
Student B:
Therefore, using the DSS helped me play the game better and also my
profits had increased as well (i.e. started making profits rather than loss). I
believe the reason for improved results was due to taking more accurate
and precise figures into account when planning the future productions.
However, with DSS, decisions were still based on using my own judgment
and the game was still played with a great deal of uncertainty due to
reliance on forecasting figures. But to make it more realistic in the essence
to make the user of the MRP Simulation believe it is the real world and to
be able to imagine themselves in that environment, I think adding more
financial data such as how much overtime and late delivery is really
costing for each component (not just the total), and more costing
information would enrich a persons mind.

Student C:
At the beginning I did not know where to start. However, with general
knowledge of production planning that I had learned from the lectures, I
continuously read and followed the instructions provided. At the same
time, I familiarised myself with the simulation software by using MS Excel
application to get a sense out of it, which was quite difficult for me. Then,
I found there were some parts of MRP that were separated by sheets or
tabs like; Bill of Materials, MRP / CRP, Routing, Order Release, and
Forecasting, which in the beginning I did not see how I can relate these
tasks together.

The comments indicate that the students did achieve a good
understanding of the production planning concepts, which was the main
goal of this exercise. Some students also developed an understanding of
the uncertainties involved and gained an appreciation of the fact that a
DSS provides decision support, but does not supersede human decision
making. Even though the students generally felt that the assignment
improved their spreadsheet skills, their skills before the assignment
ranged from a proficient level to a beginner’s level. This obviously
impacted their attitude to the spreadsheet work, which they found
anywhere from overwhelming to underwhelming, depending on their previous
experience. Most students found that this assignment took up much more
time than they would have expected or wished, relative to the 4%
assessment weighting it carried.

Suggested improvements included making a competition out of it, with
the highest profit making students gaining rewards and high marks.
Another student (an accounting major) suggested more detailed reporting
of the financial outcome of the game.

Conclusion
In this paper we presented a spreadsheet-based simulator for teaching
and learning production planning and control concepts such as
forecasting, material requirements planning, order review and release.
The game received a favourable response from the students. The
spreadsheet format helped gain acceptance. Although the students were
not judged on their financial performance, they did develop a rivalry to
gain the highest profit. This substantially increased the motivation in
playing the game. A focus on levels of reflection attained added a parallel
learning focus.

This game enhances the understanding of PPC concepts, while also giving
the students an opportunity to build a decision support system that
provides the game players with detailed modelling and decision making
capability, and to learn the difference between ad hoc decisions and
model based decisions. Learning spreadsheet modelling is an additional
educational benefit from the game. Vaszonyi (1993) and Plane (1994)
Plane (1994) present forceful arguments for using spreadsheets in
management science and operational research.

The low proportion of time spent on reflection – making sense of action,
connecting theory, and planning for further action – supports the view
that our students should benefit from greater understanding of reflection
and its praxis before playing the game. We have identified five additions
that could be made to the learning awareness component in the future:

• investigating if time spent developing spreadsheeting skills correlates
with reflection patterns witnessed;
• recording the order in which tasks were done, although this would
complicate recoding and analysis;
• relating time maps and ordering of tasks to learning styles;
• requiring more specific reflection on the reflective aspects in the final
essay;
• discussing with future classes taking the assignment, the outcomes
from this year, thereby seeding interest and awareness of reflection
pre-assignment.
Our study supports previous studies that have shown significant benefits
from using simulation gaming as a pedagogical tool. Our students did
achieve their learning objectives through the simulation. They also
expressed a positive attitude towards the “game”. Thus while we believe
that the simulation achieved its purpose, we emphasise that this was just
one of the many modules in this course. Other modules used other
learning techniques, such as debating, case studies, report writing, etc.
The simulation was particularly suited for its learning task, the other
techniques were chosen with their own goals in mind. We believe that
positive interactions between teaching and learning tools should be used
in this way to achieve the overall goals of a course. Further, such
interdependent interactions with the students can greatly enhance their
ability to learn in the outside, less structured environments. This has
particular significance for subjects aiming to transform students into
analysts and consultants.

The simulation model and the related assignment may be obtained by
writing to the authors. The simulation game is included in a list of
exemplar learning designs based on information and communication
technologies compiled by the Australian Universities Teaching
Committee. This can be accessed at the URL:
http://www.learningdesigns.uow.edu.au/exemplars/info/LD14/index.html

References
Argyris, C. (1977). Double loop learning in organizations. Harvard Business
Review, Sept-Oct, 115-125.
Biggs, W. D. (1987). Functional business games. Simulation and Games, 18(2), 242-
267.
Boseman, F. G. and Schellenberger, R. E. (1974). Business gaming: An empirical
appraisal. Simulation and Games, 5(4), 383-402.
Burgess, T. F. (1991). The use of computerized management and business
simulation in the United Kingdom. Simulation and Gaming, 22(2), 174-195.
Churchill, G. (1970). Joblot: A Production Management Game. UK: Macmillan.
Faria, A. J. (1987). A survey of the use of business games in academia and
business. Simulation and Games, 18(2), 207-224.
Faria, A. J. (1990). Business simulation games after thirty years: Current usage
levels in the United States. In J. W. Gentry (Ed), Guide to Business Gaming and
Experiential Learning, 36-47. London: Nichols/GP Publishing.
Gilgeous, V. and D'Cruz, M. (1996). A study of business and management games.
Management Development Review, 9(1), 32-39.
Kibbee, J. M., Craft, C. J. and Nanus, B. (1961). Management Games: A New
Technique for Executive Development. New York: Reinhold.
Kolb, D.A. (1984). Experiential Learning: Experience as the Source of Learning and
Development. USA: Prentice-Hall.
Lane, D. C. (1995). On a resurgence of management simulations and games.
Journal of the Operational Research Society, 46, 604-625.
McKenna, R. J. (1991). Business computerized simulation: The Australian
experience. Simulation and Gaming, 22(1), 36-62.
Mize, J. H, Herring, B. E., Cook, C. L., Chun, M. S. and White, C. R. (1971).
Production System Simulator (PROSIM V): A User's Manual. USA: Prentice-Hall.
Parsuraman, A. (1981). Assessing the worth of business simulation games.
Simulation and Games, 12(2), 189-200.
Plane, D. R. (1994). Spreadsheet power. OR/MS Today, 20, 32-38.
Raia, A. P. (1966). A study of the educational value of management games. Journal
of Business, 39(3), 339-352.
Ruohomäki, V. (1995). Viewpoints on learning and education with simulation
games. In J. O. Riis (Ed), Simulation Games and Learning in Production
Management, 13-25. UK: Chapman and Hall.
Scott, J. L. (1990). OR methodology and the learning cycle. OMEGA International
Journal of Management Science, 18(5), 551-553.
Scott, J. L. (2002). Stimulating awareness of actual learning processes. Journal of the
Operational Research Society, 53(1), 2-10.
Vaszonyi, A. (1993). Where we ought to be going: The potential of spreadsheets.
Interfaces, 23, 26-39.
Wolfe, J. and Guth, F. R. (1975). The case approach versus gaming in the teaching
of business policy: An experimental evaluation. Journal of Business, 48(3), 349-
364.
Wolfe, J. (1975). A comparative evaluation of experiential approach as a business
policy learning environment. Academy of Management Journal, 18, 442-452
Wolfe, J. (1993). A history of business teaching games in English-speaking and
post-socialist countries: The origination and diffusion of a management
education and development technology. Simulation and Gaming, 24(4), 446-463.
Wu, N. L. (1989). Understanding production through human simulation:
Experiencing JIC (just-in-case), JIT (just-in-time), and OPT (optimised-
production-technology) production systems, International Journal of Operations
and Production Management, 9(1), 27-34.
Yeo, G. K. and Nah, F. H. (1992). A participants' DSS for a management game
with a DSS generator, Simulation and Gaming, 23(3), 341-353.

Appendix 1: Activity log sample


0342.576 Advanced Operations Management. Assignment 1, Part B.
Production Planning at “Greek Manufacturing Company”

Individual ACTIVITY LOG for step 1: Playing the game in an ad-hoc manner
For each activity, the log provides columns for “Mins spent” and “Thinking
associated (and outcomes)*”.

ACTIVITY:
• Read the assignment
• Wrote down questions and thoughts
• Played around inside the spreadsheet file
• Put in some rough numbers into the spreadsheet, just to see what would happen
• Developed expectations about the results of running this simulation
• Compared simulation results with your predictions/expectations
• Openly questioned processes you were using in forming your expectations
• Sought further information about specific things
• Discussed the assignment with my group
• Worked out numbers before running the simulation
• Changed my/our approach ….. times during the 32 weeks
• Predicted what would happen before we simulated
• Questioned the processes being used to ‘Make/alter decisions’
• Questioned the entire approach you were using
• Any other activities (state here)

* Think of the page as a map of what you did, with enough detail for the reader to
be able to understand all the activities you followed and your thinking behind it.
Tell it truthfully, as there is no “right answer” here, just a conscientious
completion of the picture of your activities.
Profit achieved in your best simulation run in an ad-hoc manner = $ ____

Read through the page now, speaking it out in your head as if you were telling
someone what you did, and the thinking behind it.

Chuda Basnet and John L. Scott
Department of Management Systems, University of Waikato
Private Bag 3105, Hamilton 2020, New Zealand
Telephone: +64 7 838 4562 Fax: + 64 7 838 4270
Email: [email protected], [email protected]
http://www.mngt.waikato.ac.nz/school/staff/staffhome.asp?ident=732&user=JLS
