QUANTITATIVE TECHNIQUES IN MANAGEMENT
Management Programme
SOLVED ASSIGNMENT
2010
Amity University
Spl. Note:
Unauthorized copying, selling and redistribution of the content are strictly
prohibited. This material is provided for reference only.
Assignment A
Question 1: How has quantitative analysis changed the current scenario
in the management world today?
Answer:
Quantitative analysis requires the representation of the problem using
a mathematical model. Mathematical modeling is a critical part of the
quantitative approach to decision making. Quantitative factors can be
measured in terms of money or quantitative units. Examples are
incremental revenue, added cost, and initial outlay.
Qualitative factors in decision making are the factors relevant to a
decision that are difficult to measure in terms of money. Qualitative
factors may include: (1) effect on employee morale, schedule and other
internal elements; (2) relationship with and commitments to
suppliers; (3) effect on present and future customers; and (4) long-
term future effect on profitability. In some decision-making situations,
qualitative aspects are more important than immediate financial benefit
from a decision.
Different Statistical
Techniques
Measures of Central Tendency: For proper understanding of
quantitative data, they should be classified and converted into a
frequency distribution. This type of condensation of data reduces their
bulk and gives a clear picture of their structure. If you want to know any
specific characteristic of the given data, or if the frequency distribution
of one set of data is to be compared with another, then the frequency
distribution itself must be summarized and condensed so that it helps us
make useful inferences about the data and also provides a yardstick for
comparing different sets of data.
Measures of Dispersion: Measures of dispersion tell you the extent
to which the values differ from the
mean, median or mode. The commonly used measures of dispersion are
range, mean deviation and standard deviation.
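These three measures can be sketched directly in a few lines of Python (the data values are invented for illustration):

```python
# A minimal sketch with invented data, computing the three measures named above.
data = [4, 8, 15, 16, 23, 42]
n = len(data)
mean = sum(data) / n
rng = max(data) - min(data)                       # range
mean_dev = sum(abs(x - mean) for x in data) / n   # mean deviation about the mean
std_dev = (sum((x - mean) ** 2 for x in data) / n) ** 0.5  # population std. dev.
print(rng, round(mean_dev, 2), round(std_dev, 2))
```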
Correlation: Correlation coefficient measures the degree to which
the change in one variable (the dependent variable) is associated with
change in the other variable (Independent one). For example, as a
marketing manager, you would like to know if there is any relation
between the amounts of money you spend on advertising and the sales
you achieve. Here, sales are the dependent variable and advertising
budget is the independent variable. Correlation coefficient, in this case,
would tell you the extent of relationship between these two
variables, whether the relationship is directly proportional (i.e.
an increase or decrease in advertising is associated with an increase or
decrease in sales), whether it is an inverse relationship (i.e.
increasing advertising is associated with a decrease in sales and
vice-versa), or whether there is no relationship between the two variables.
Regression Analysis: Regression analysis includes any techniques for
modeling and analysing several variables, when the focus is on the
relationship between a dependent variable and one or more independent
variables. Using this technique you can predict the dependent variables
on the basis of the independent variables. In 1970, the NCAER (National
Council of Applied Economic Research) predicted the annual stock of
scooters using a regression model in which real personal disposable
income and the relative weighted price index of scooters were used as
independent variables.
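For a single independent variable, the least-squares fit can be sketched in plain Python; the data below are invented for illustration (not the NCAER scooter figures):

```python
# A hedged sketch of simple least-squares regression: fit y = a + b*x.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]   # independent variable, e.g. advertising spend
ys = [2.1, 4.0, 6.2, 7.9, 10.1]  # dependent variable, e.g. sales
n = len(xs)
sx, sy = sum(xs), sum(ys)
sxy = sum(x * y for x, y in zip(xs, ys))
sxx = sum(x * x for x in xs)
b = (n * sxy - sx * sy) / (n * sxx - sx ** 2)   # slope
a = (sy - b * sx) / n                           # intercept
predict = lambda x: a + b * x                   # prediction from the fitted line
print(round(b, 2), round(a, 2), round(predict(6.0), 2))
```

Once `a` and `b` are known, the dependent variable can be predicted for any new value of the independent variable, which is exactly how such models support forecasting.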
Time Series Analysis: With time series analysis, you can isolate and
measure the separate effects of the forces acting on a variable over time
(trend, cyclical, seasonal and irregular movements). Examples of such
changes can be seen if you measure the increase in the cost of living, the
growth of population over a period of time, the growth of agricultural food
production in India over the last fifteen years, seasonal requirements of
items, or the impact of floods, strikes and wars.
Index Numbers: An index number is an economic data figure reflecting
price or quantity compared with a standard or base value. The base
usually equals 100 and the index number is usually expressed as 100
times the ratio to the base value. For example, if a commodity costs
twice
as much in 1970 as it did in 1960, its index number would be 200 relative to
1960. Index numbers are used especially to compare business activity, the
cost of living, and employment. They enable economists to reduce unwieldy
business data into easily understood terms.
Sampling and Statistical Inference: In many cases due to shortage of
time, cost or non-availability of data, only limited part or section of the
universe (or population) is examined to (a) get information about the
universe as clearly and precisely as possible, and (b) determine the
reliability of the estimates. This small part or section selected from the
universe is called the sample, and the process of selections such a section
(or past) is called sampling.
Example: Site selection process (quantitative and
qualitative factors)
While quantitative factors have been and will continue to be very
important in the site selection process, qualitative factors are also
critical in order to ensure that the company makes the best decision.
What are the most important quantitative and qualitative factors
evaluated by site selection advisors and companies when making a
decision regarding the location of a new or expanded operation? The list
will vary depending on type of facility (i.e. manufacturing, logistics,
research & technology, office), but most factors apply to all forms
of projects. Below is a summary of the most important quantitative
and qualitative factors considered by companies.
Quantitative Factors
1. Property Tax Rates
2. Corporate Income Tax Rates
3. Sales Tax Rates
4. Real Estate Costs
5. Utility Rates
6. Average Wage/Salary Levels
7. Construction Costs
8. Worker’s Compensation Rates
9. Unemployment Compensation Rates
10. Personal Income Tax Rates
11. Industry Sector Labour Pool Size
12. Infrastructure Development Costs
13. Education Achievement Levels
14. Crime Statistics
15. Frequency of Natural Disasters
16. Cost of Living Index
17. Number of Commercial Flights to Key Markets
18. Proximity to Major Key Geographic Markets
19. Unionization Rate/Right to Work versus Non-Right to Work State
20. Population of Geographic Area
Qualitative Factors
1. Level of Collaboration with Government, Educational and Utility Officials
2. Sports, Recreational and Cultural Amenities
3. Confidence in Ability of All Parties to Meet Company’s Deadlines
4. Political Stability of Location
5. Climate
6. Availability of Quality Healthcare
7. Chemistry of Project Team with Local and State Officials
8. Perception of Quality of Professional Services Firms to Meet the
Company’s Needs
9. Predictability of Long-term Operational Costs
10. Ability to Complete Real Estate Due Diligence Process Quickly
Another important part of the site selection evaluation process relates to
the weighting of the key quantitative and qualitative factors. Depending
on the type of project, factors will be weighted differently. As an example,
for a new manufacturing facility project, issues such as utility rates, real
estate costs, property tax rates, collaboration with governmental entities,
and average hourly wage rates may be weighted more heavily. By
contrast, for a new office facility, factors such as real estate costs, the
number of commercial flights, crime statistics, climate and industry sector
labour pool size may be more important.
Every project is unique and must be evaluated based upon its own
individual set of circumstances.
Question 2: What are sampling techniques? Briefly explain the
cluster sampling technique.
Answer:
A sample is a group of units selected from a larger group (the population). By
studying the sample, one hopes to draw valid conclusions about the larger
group.
A sample is generally selected for study because the population is too
large to study in its entirety. The sample should be representative of the
general population. This is often best achieved by random sampling. Also,
before collecting the sample, it is important that one carefully and
completely defines the population, including a description of the members
to be included.
A common problem in business statistical decision-making arises when we
need information about a collection called a population but find that the
cost of obtaining the information is prohibitive. For instance, suppose we
need to know the average shelf life of current inventory. If the inventory is
large,
the cost of checking records for each item might be high enough to cancel
the benefit of having the information. On the other hand, a hunch about
the
average shelf life might not be good enough for decision-making
purposes. This means we must arrive at a compromise that involves
selecting a small number of items and calculating an average shelf life as
an estimate of the average shelf life of all items in inventory. This is a
compromise, since the measurements for a sample from the inventory will
produce only an estimate of the value we want, but at substantial savings.
What we would like to know is how "good" the estimate is and how much
more will it cost
to make it "better". Information of this type is intimately related to
sampling techniques.
Cluster sampling can be used whenever the population can be partitioned
into groups (clusters), each of which is broadly representative of the whole.
In many applications the partitioning is a result of physical distance. For
instance, in the insurance industry, there are small "clusters" of employees
in field offices scattered about the country. In such a case, a random
sampling of employee work habits might not require travel to many of the
"clusters" or field offices in order to get the data. Totally sampling each one
of a small number of clusters chosen at random can eliminate much of the
cost associated with the data requirements of management.
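The field-office example above can be sketched as follows: pick a few clusters (offices) at random, then include every employee in the chosen clusters. The office names and work-hours data here are invented:

```python
# Cluster sampling sketch: sample whole clusters, then take every unit inside.
import random

offices = {  # cluster -> weekly work hours of each employee in that office
    "Delhi":   [38, 40, 42, 41],
    "Mumbai":  [39, 44, 40],
    "Chennai": [37, 41, 43, 40],
    "Kolkata": [42, 40, 39],
}
random.seed(7)                                # fixed seed for reproducibility
chosen = random.sample(sorted(offices), k=2)  # stage 1: choose clusters at random
sample = [h for office in chosen for h in offices[office]]  # stage 2: take all
estimate = sum(sample) / len(sample)          # estimated mean weekly hours
print(chosen, round(estimate, 2))
```

The travel cost falls because data collection is confined to the two chosen offices instead of all four.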
Question 3: What is the significance of Regression Analysis? How does
it help a manager in the decision making process?
Answer:
Regression analysis is a powerful technique for studying the relationship between
dependent variables (i.e., outputs, performance measures) and independent variables
(i.e., inputs, factors, decision variables). Summarizing relationships among the
variables by the most appropriate equation (i.e., modeling) allows us to predict or
identify the most influential factors and study their impacts on the output for any
changes in their current values.
Unlike the deterministic decision-making process (such as linear optimization
by solving systems of equations, or parametric systems of equations), in
decision making under uncertainty the variables are often more numerous
and more difficult to measure and control. However, the steps are the same.
They are:
1. Simplification
2. Building a decision model
3. Testing the model
4. Using the model to find the solution:
- It is a simplified representation of the actual situation
- It need not be complete or exact in all respects
- It concentrates on the most essential relationships and ignores the
less essential ones.
- It is more easily understood than the empirical (i.e., observed)
situation, and hence permits the problem to be solved more
readily with minimum time and effort.
- It can be used again and again for similar problems, or can be modified.
Fortunately the probabilistic and statistical methods for analysis and decision
making under uncertainty are more numerous and powerful today than ever before.
The computer makes possible many practical applications. A few examples of
business applications are the following:
- An auditor can use random sampling techniques to audit the
accounts receivable for clients.
- A plant manager can use statistical quality control techniques to assure
the quality of his production with a minimum of testing or inspection.
- A financial analyst may use regression and correlation to help understand
the relationship of a financial ratio to a set of other variables in business.
- A market researcher may use tests of significance to accept or reject
hypotheses about a group of buyers to which the firm wishes to sell
a particular product.
- A sales manager may use statistical techniques to forecast sales for
the coming year.
Question 4: Explain the following terms in detail (give examples
where necessary): -
(a.) Arithmetic mean
(b.) Harmonic mean
(c.) Geometric mean
(d.) Median
(e.) Mode
Answer:
(a.) Arithmetic Mean:
The arithmetic mean (or the average, simple mean) is computed by
summing all numbers in an array of numbers (xi) and then dividing by the
number of observations (n) in the array.
Mean = Σxᵢ / n, where the sum is taken over all i.
The mean uses all of the observations, and each observation affects the
mean. Even though the mean is sensitive to extreme values; i.e., extremely
large or small data can cause the mean to be pulled toward the
extreme data; it is still the most widely used measure of location. This is
due to the fact that the mean has valuable mathematical properties that
make it convenient for use with inferential statistical analysis. For example,
the sum of the deviations of the numbers in a set of data from the mean is
zero, and the sum of the squared deviations of the numbers in a set of data
from the mean is the minimum value.
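Both properties can be verified numerically on assumed data: the deviations from the mean sum to zero, and the sum of squared deviations is smallest when taken about the mean:

```python
# Verifying the two properties of the arithmetic mean stated above.
data = [3, 7, 7, 19]
mean = sum(data) / len(data)                     # 9.0
dev_sum = sum(x - mean for x in data)
sq = lambda c: sum((x - c) ** 2 for x in data)   # sum of squared deviations
grid = [c / 10 for c in range(0, 200)]           # candidate centres 0.0 .. 19.9
print(abs(dev_sum) < 1e-9)                       # True: deviations sum to zero
print(sq(mean) <= min(sq(c) for c in grid))      # True: minimised at the mean
```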
(b.) Harmonic Mean:
The harmonic mean (H) is another specialized average, which is useful in
averaging variables expressed as rate per unit of time, such as mileage
per hour, number of units produced per day. The harmonic mean (H) of n
non-zero numerical values xᵢ is: H = n / Σ(1/xᵢ).
An Application: Suppose 4 machines in a machine shop are used to
produce the same part. However, the four machines take 2.5, 2.0, 1.5,
and 6.0 minutes respectively to make one part. What is the average rate
of speed?
The harmonic mean is: H = 4 / [(1/2.5) + (1/2.0) + (1/1.5) + (1/6.0)] =
2.31 minutes.
If all machines work for one hour, how many parts will be produced?
Since four machines running for one hour represent 240 minutes of
operating time, then: 240 / 2.31 = 104 parts will be produced.
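The machine-shop figures above can be reproduced with a small helper:

```python
# Harmonic mean of rates: H = n / sum(1/x_i).
def harmonic_mean(values):
    return len(values) / sum(1.0 / v for v in values)

times = [2.5, 2.0, 1.5, 6.0]   # minutes per part, one value per machine
h = harmonic_mean(times)       # average time per part
print(round(h, 2))             # 2.31 minutes
print(round(240 / h))          # parts produced by 4 machines in one hour: 104
```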
(c.) The Geometric Mean:
The geometric mean (G) of n non-negative numerical values is the nth root
of the product of the n values.
If some values are very large in magnitude and others are small, then the
geometric mean is a better representative of the data than the simple average.
In a "geometric series", the most meaningful average is the geometric mean
(G). The arithmetic mean is very biased toward the larger numbers in the
series.
An Application: Suppose sales of a certain item increase to 110% in the
first year and to 150% of that in the second year. For simplicity, assume
you sold 100 items initially. Then the number sold in the first year is 110
and the number sold in the second is 150% × 110 = 165. The arithmetic
average of 110% and 150% is 130%, so we would incorrectly estimate that
the number sold in the first year is 130 and the number in the second year
is 169. The geometric mean of 110% and 150% is G = (1.10 × 1.50)^(1/2)
≈ 1.285, so we would correctly estimate that we would sell 100 × G² = 165
items in the second year.
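The arithmetic in the worked example checks out numerically:

```python
# Geometric mean of the two growth factors 1.10 and 1.50.
g = (1.10 * 1.50) ** 0.5      # square root of the product of the two factors
print(round(g, 4))            # ≈ 1.2845
print(round(100 * g ** 2))    # items sold in the second year: 165
```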
(d.) Median:
The median is the middle value in an ordered array of
observations. If there is an even number of observations in the array, the
median is the average of the two middle numbers. If there is an odd
number of data in the array, the median is the middle number.
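A minimal helper reflecting this definition, covering both odd and even n:

```python
# Median of an array: middle value, or average of the two middle values.
def median(xs):
    s = sorted(xs)
    n = len(s)
    mid = n // 2
    return s[mid] if n % 2 else (s[mid - 1] + s[mid]) / 2

print(median([7, 1, 3]))       # 3
print(median([7, 1, 3, 10]))   # (3 + 7) / 2 = 5.0
```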
The median is often used to summarize the distribution of an outcome. If
the distribution is skewed, the median and the interquartile range (IQR)
may be
better than other measures to indicate where the observed data
are concentrated.
Generally, the median provides a better measure of location than the mean
when there are some extremely large or small observations; i.e., when the
data are skewed to the right or to the left. For this reason, median income
is used as the measure of location for the U.S. household income. Note that
if the median is less than the mean, the data set is skewed to the right. If
the median is greater than the mean, the data set is skewed to the left. For
a normal population, the sample median is distributed normally with mean
equal to the population mean, and the standard error of the median is
√(π/2) ≈ 1.25 times the standard error of the mean.
The mean has two distinct advantages over the median. It is more stable,
and one can compute the mean based on two samples by combining the
two means.
(e.) Mode:
The mode is the most frequently occurring value in a set of observations.
Why use the mode? The classic example is the shirt/shoe manufacturer who
wants to decide what sizes to introduce. Data may have two modes. In this
case, we say the data are bimodal, and sets of observations with more than
two modes are referred to as multimodal. Note that the mode is not always
a helpful measure of location, because there can be more than one mode
or even no mode.
When the mean and the median are known, it is possible to estimate the
mode for the unimodal distribution using the other two averages as
follows:
Mode ≈ 3(median) - 2(mean)
This estimate is applicable to both grouped and ungrouped data sets.
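On assumed, moderately right-skewed data the relation gives a reasonable (though approximate) estimate:

```python
# Illustrating Mode ≈ 3(median) - 2(mean) on invented right-skewed data;
# the relation is empirical and approximate, not exact.
data = [2, 2, 2, 3, 3, 4, 4, 5, 5]
mean = sum(data) / len(data)                 # 30/9 ≈ 3.33
median = sorted(data)[len(data) // 2]        # 3 (odd number of observations)
est_mode = 3 * median - 2 * mean             # ≈ 2.33
true_mode = max(set(data), key=data.count)   # 2 occurs most often
print(true_mode, round(est_mode, 2))
```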
Question 5: Explain the classical approach to the probability theory.
Also explain the limitation of classical definition of probability.
Answer:
The classical approach to probability is to count the number of favorable
outcomes, the number of total outcomes (outcomes are assumed to be
mutually exclusive and equiprobable), and express the probability as a
ratio of these two numbers. Here, "favorable" refers not to any subjective
value given to the outcomes, but is rather the classical terminology
used to indicate that an outcome belongs to a given event of interest. What
is meant by this will be made clear by an example, and formalized
with the introduction of axiomatic probability theory.
Classical definition of probability
If the number of outcomes belonging to an event E is N_E, and the total
number of outcomes is N, then the probability of event E is defined as
P(E) = N_E / N.
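The classical ratio can be sketched for a fair die, with E the event "an even face shows":

```python
# Classical probability: count favourable outcomes over total outcomes.
outcomes = [1, 2, 3, 4, 5, 6]                 # N = 6 equiprobable outcomes
event = [o for o in outcomes if o % 2 == 0]   # N_E = 3 favourable outcomes
p = len(event) / len(outcomes)                # P(E) = N_E / N
print(p)   # 0.5
```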
Limitation of classical definition of probability
The classical definition applies only when the outcomes are finite in number
and equally likely; it cannot handle experiments with infinitely many or
non-equiprobable outcomes, and it is circular in that "equally likely" itself
presupposes a notion of probability. More broadly, there are basically four
types of probabilities, each with its limitations. None of these approaches
to probability is wrong, per se, but some are more useful or more general
than others.
In everyday speech, we express our beliefs about likelihoods of events using
the same terminology as in probability theory. Often, this has nothing to do with
any formal definition of probability, rather it is an intuitive idea guided by our
experience, and in some cases statistics.
Probability can also be expressed in vague terms. For example, someone might say
it will probably rain tomorrow. This is subjective, but implies that the speaker
believes the probability is greater than 50%.
Subjective probabilities have been extensively studied, especially with regards to
gambling and securities markets. While this type of probability is important, it is
not the subject of this book. A good reference is "Degrees of Belief" By Steven Vick
(2002).
There are two standard approaches to conceptually interpreting probabilities.
The first is the long-run (or relative frequency) approach and the second is
the subjective belief (or confidence) approach. In the Frequency Theory of
Probability, probability is the limit of the relative frequency with which an
event occurs in repeated trials (note that trials must be independent).
Frequentists talk about probabilities only when dealing with experiments that are
random and well-defined. The probability of a random event denotes the relative
frequency of occurrence of an experiment’s outcome, when repeating the
experiment. Frequentists consider probability to be the relative frequency "in
the long run" of outcomes.
Physical probabilities, which are also called objective or frequency probabilities, are
associated with random physical systems such as roulette wheels, rolling dice and
radioactive atoms. In such systems, a given type of event (such as the dice
yielding a six) tends to occur at a persistent rate, or 'relative frequency', in a
long run of trials. Physical probabilities either explain, or are invoked to explain,
these stable frequencies. Thus talk about physical probability makes sense only
when dealing with well-defined random experiments. The two main kinds of theory
of physical probability are frequentist accounts (such as Venn) and propensity
accounts.
Relative frequencies are always between 0% (the event essentially never happens)
and 100% (the event essentially always happens), so in this theory as well,
probabilities are between 0% and 100%. According to the Frequency Theory of
Probability, what it means to say that "the probability that A occurs is p%" is that if
you repeat the experiment over and over again, independently and under
essentially identical conditions, the percentage of the time that A occurs will
converge to p. For example, under the Frequency Theory, to say that the chance
that a coin lands heads is 50% means that if you toss the coin over and over
again, independently, the ratio of the number of times the coin lands heads to the
total number of tosses approaches a limiting value of 50% as the number of tosses
grows. Because the ratio of heads to tosses is always between 0% and 100%, when
the probability exists it must be between 0% and 100%.
In the Subjective Theory of Probability, probability measures the speaker's "degree
of belief" that the event will occur, on a scale of 0% (complete disbelief that
the event will happen) to 100% (certainty that the event will happen). According to
the Subjective Theory, what it means for me to say that "the probability that
A occurs is 2/3" is that I believe that A will happen twice as strongly as I
believe that A will not happen. The Subjective Theory is particularly useful
in assigning meaning to the probability of events that in principle can occur
only once. For example, how might one assign meaning to a statement like
"there is a 25% chance of an earthquake on the San Andreas fault with
magnitude 8 or larger before 2050"? (See Freedman and Stark, 2003, for
more discussion of theories of probability and their application to
earthquakes.) It is very hard to use either the Theory of Equally Likely
Outcomes or the Frequency Theory to make sense of the assertion.
Assignment B
Question 1: Write a note on decision making in management. How will
one take decisions under risk and uncertainty?
Answer:
Decision-making is a crucial part of good business. The question then is:
‘how is a good decision made?’
One part of the answer is good information, and experience in interpreting
information. Consultation, i.e. seeking the views and expertise of other
people, also helps, as does the ability to admit one was wrong and change
one’s mind. There are also aids to decision-making, various techniques
which help to make information clearer and better analysed, and to add
numerical and objective precision to decision-making (where appropriate)
to reduce the amount of subjectivity.
Managers can be trained to make better decisions. They also need a
supportive environment where they won’t be unfairly criticised for making
wrong decisions (as we all do sometimes) and will receive proper support
from their colleagues and superiors. A climate of criticism and fear stifles
risk-taking and creativity; managers will respond by ‘playing it safe’ to
minimise the risk of criticism, which diminishes the business’s effectiveness in
responding to market changes. It may also mean managers spend too much
time trying to pass the blame around rather than getting on with running the
business.
Decision-making increasingly happens at all levels of a business. The Board
of Directors may make the grand strategic decisions about investment and
direction of future growth, and managers may make the more tactical
decisions about how their own department may contribute most effectively
to the overall business objectives. But quite ordinary employees are
increasingly expected to make decisions about the conduct of their own
tasks, responses to customers and improvements to business practice.
This needs careful recruitment and selection, good training, and
enlightened management.
Types of Business Decisions
1. Programmed Decisions. These are standard decisions which always
follow the same routine. As such, they can be written down as a series of
fixed steps which anyone can follow. They could even be written as a
computer program.
2. Non-Programmed Decisions. These are non-standard and non-
routine. Each decision is not quite the same as any previous decision.
3. Strategic Decisions. These affect the long-term direction of the
business eg whether to take over Company A or Company B
4. Tactical Decisions. These are medium-term decisions about how to
implement strategy eg what kind of marketing to have, or how many
extra staff to recruit
5. Operational Decisions. These are short-term decisions (also called
administrative decisions) about how to implement the tactics eg which
firm to use to make deliveries.
Figure 1: Levels of Decision-Making
Figure 2: The Decision-Making Process
The model in Figure 2 above is a normative model, because it illustrates
how a good decision ought to be made. Business Studies also uses
positive models which simply aim to illustrate how decisions are, in fact,
made in businesses without commenting on whether they are good or bad.
Linear programming models help to explore maximising or minimising
constraints e.g. one can program a computer with information that
establishes parameters for minimising costs subject to certain situations and
information about those situations.
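The kind of model described above can be sketched in miniature. A real linear-programming solver works algebraically on the constraint equations; this toy version simply searches integer production plans, and all numbers (profits, resource limits) are invented:

```python
# Toy "minimise/maximise subject to constraints" sketch: choose quantities
# x, y of two products to maximise profit within two resource limits.
best = None
for x in range(101):             # units of product A
    for y in range(101):         # units of product B
        if 2 * x + y <= 100 and x + 3 * y <= 90:   # machine-hours, labour-hours
            profit = 30 * x + 40 * y
            if best is None or profit > best[0]:
                best = (profit, x, y)
print(best)   # (1900, 42, 16): make 42 of A and 16 of B for profit 1900
```

A spreadsheet or solver reaches the same optimum by evaluating the corner points of the feasible region instead of enumerating every plan.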
Spread-sheets are widely used for ‘what if’ simulations. A very large
spread-sheet can be used to hold all the known information about, say,
pricing and the effects of pricing on profits. The different pricing
assumptions can be fed into the spread-sheet ‘modelling’ different pricing
strategies. This is a lot quicker and an awful lot cheaper than actually
changing prices to see what happens. On the other hand, a spread-sheet is
only as good as the information put into it and no spread-sheet can fully
reflect the real world. But it is very useful management information to know
what might happen to profits ‘what if’ a skimming strategy, or a penetration
strategy were used for pricing.
The computer does not take decisions; managers do. But it helps managers
to have quick and reliable quantitative information about the business as it
is and the business as it might be in different sets of circumstances. There
is, however, a lot of research into ‘expert systems’ which aim to replicate
the way real people (doctors, lawyers, managers, and the like) take
decisions. The aim is that computers can, one day, take decisions, or at
least programmed decisions (see above). For example, an expedition could
carry an expert medical system on a lap-top to deal with any medical
emergencies even though the nearest doctor is thousands of miles away.
Already it is
possible, in the US, to put a credit card into a ‘hole-in-the-wall’ machine
and get basic legal advice about basic and standard legal problems.
Constraints on Decision-Making
Internal Constraints
These are constraints that come from within the business itself.
- Availability of finance. Certain decisions will be rejected because
they cost too much
- Existing Business Policy. It is not always practical to re-write
business policy to accommodate one decision
- People’s abilities and feelings. A decision cannot be taken if it
assumes higher skills than employees actually have, or if the decision is so
unpopular no-one will work properly on it.
External Constraints
These come from the business environment outside the business.
- National & EU legislation
- Competitors’ behaviour, and their likely response to decisions
your business makes
- Lack of technology
- Economic climate
Quality of Decision-Making
Some managers and businesses make better decisions than others.
Good decision-making comes from:-
1. Training of managers in decision-making skills. See Developing
Managers
2. Good information in the first place.
3. Management skills in analysing information and handling
its shortcomings.
4. Experience and natural ability in decision-making.
5. Risk and attitudes to risk.
6. Human factors. People are people. Emotional responses come
before rational responses, and it is very difficult to get people to
make rational decisions about things they feel very strongly about.
Rivalries and vested interests also come into it. People simply take
different views on the same facts, and people also simply make
mistakes.
Question 2: The Mumbai Cricket Club, a professional club for the
cricketers, has the player who led the league in batting average for many
years. Over the past ten years, Amod Kambali has achieved a mean
batting average of 54.50 runs with a standard deviation of 5.5 runs. This
year Amod played 25 matches and achieved an average of 48.80 runs
only. Amod is negotiating his contract with the club for the next year, and
the salary he will be able to obtain is highly dependent upon his ability to
convince the team’s owner that his batting average this year was
not significantly worse than in the previous years. The selection
committee of the club is willing to use a 0.01 significance level.
You are required to find out whether Amod’s salary will be cut next year.
Answer:
Null Hypothesis, H0: Amod’s batting average this year (48.80) is
not significantly different from his all-time batting average of
54.50.
Alternative Hypothesis, Ha: Amod’s batting average this year
(48.80) is significantly lower than his all-time batting average of
54.50.
α = 0.01
t = (48.80 - 54.50) / (5.5 / √25) = -5.70 / 1.10 = -5.1818
The critical value of t at α = 0.01 (one-tailed, df = 24) is -2.492.
Conclusion: Since -5.1818 < -2.492, reject H0 and accept Ha (Amod’s
batting average this year is significantly lower than his all-time batting
average). Amod’s salary will most likely be cut next year.
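The test statistic can be re-derived in a few lines:

```python
# One-sample t statistic for the batting-average test above (left-tailed).
mu0, xbar, s, n = 54.50, 48.80, 5.5, 25
t = (xbar - mu0) / (s / n ** 0.5)   # = -5.70 / 1.10
print(round(t, 4))                  # -5.1818
t_critical = -2.492                 # t(0.01, df = 24), left tail, from tables
print(t < t_critical)               # True -> reject H0
```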
Question 3: The salaries paid to the managers of a company had a mean of
Rs. 20,000 with a standard deviation of Rs. 3,000. What will be the mean
and standard deviation if all the salaries are increased by
1) 10%
2) 10% of existing mean
3) Which policy would you recommend if the management does not
want to have increased disparities of wages?
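The effect of each policy on the mean and standard deviation can be checked numerically: multiplying every salary by 1.10 scales both the mean and the standard deviation by 1.10, while adding a flat amount equal to 10% of the existing mean (Rs. 2,000) raises the mean but leaves the standard deviation unchanged. A sketch (the individual salaries are invented; only their mean is matched to the question):

```python
# Scaling vs. shifting: how each salary policy affects mean and std. deviation.
salaries = [17000, 20000, 23000]   # invented data with mean 20,000

def stats(xs):
    m = sum(xs) / len(xs)
    sd = (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5
    return m, sd

m0, sd0 = stats(salaries)
m1, sd1 = stats([1.10 * x for x in salaries])       # policy 1: every salary +10%
m2, sd2 = stats([x + 0.10 * m0 for x in salaries])  # policy 2: flat Rs. 2,000 each
print(round(m1), round(sd1 / sd0, 2))   # mean 22000, sd scaled by 1.10
print(round(m2), round(sd2 / sd0, 2))   # mean 22000, sd unchanged (ratio 1.0)
```

Hence policy 2 (a flat increase of 10% of the existing mean) is the one to recommend if management does not want increased disparities: it leaves the standard deviation, and therefore absolute wage disparities, unchanged at Rs. 3,000, whereas policy 1 raises it to Rs. 3,300.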
Case study
Please read the case study given below and answer questions
given at the end.
Kushal Arora, a second year MBA student, is doing a study of companies going
public for the first time. He is curious to see whether or not there is a
significant relationship between the size of the offering (in crores of rupees)
and the price per share after the issue. The data are given below:
Size (in crore of rupees):   108    39    68.40    51    10.40    4.40
Price (in rupees):            12    13    19       12     6.50    4
Question
You are required to calculate the coefficient of correlation for the above data
set and comment what conclusion Kushal should draw from the sample.
Answer:
Here X = price per share and Y = size of the offering.

N        X       Y        XY        X²        Y²
1        12      108      1296      144       11664
2        13      39       507       169       1521
3        19      68.4     1299.6    361       4678.56
4        12      51       612       144       2601
5        6.5     10.4     67.6      42.25     108.16
TOTALS   66.5    281.2    3799.8    876.25    20592.08
r = [6(3799.8) - (66.5)(281.2)] / √{[6(876.25) - (66.5)²][6(20592.08) - (281.2)²]} = 0.67
Conclusion: There is a moderate positive correlation (r ≈ 0.67) between the
size of the offering and the price per share, so Kushal should conclude that
larger offerings tend to be associated with higher share prices in this sample.
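A quick check of the correlation coefficient (a sketch; variable names are illustrative):

```python
import math

size  = [108, 39, 68.40, 51, 10.40, 4.40]   # Y: offering size (crore Rs)
price = [12, 13, 19, 12, 6.50, 4]           # X: price per share (Rs)

n = len(size)
sx, sy = sum(price), sum(size)
sxy = sum(x * y for x, y in zip(price, size))
sxx = sum(x * x for x in price)
syy = sum(y * y for y in size)

# Pearson's r = [nΣXY - ΣXΣY] / √([nΣX² - (ΣX)²][nΣY² - (ΣY)²])
r = (n * sxy - sx * sy) / math.sqrt((n * sxx - sx**2) * (n * syy - sy**2))
print(round(r, 2))   # 0.67
```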
1. A survey to collect data on the entire population is
Options
a census
a sample
a population
an inference
2. A portion of the population selected to represent the
population is called
Options
statistical inference
descriptive statistics
a census
a sample
3. Qualitative data can be graphically represented by
using a(n)
Options
histogram
frequency polygon
ogive
bar graph
4. Fifteen percent of the students in a school of Business
Administration are majoring in Economics, 20% in
Finance, 35% in Management, and 30% in Accounting.
The graphical device(s) which can be used to present
these data is (are)
Options
a line graph
only a bar graph
only a pie chart
both a bar graph and a pie chart
5. A histogram is
Options
a graphical presentation of a frequency or relative frequency
distribution
a graphical method of presenting a cumulative frequency or a
cumulative relative frequency distribution
the history of data elements
the same as a pie chart
6. The mean of a sample
Options
is always equal to the mean of the population
is always smaller than the mean of the population
is computed by summing the data values and dividing the sum by
(n - 1)
is computed by summing all the data values and dividing the sum
by the number of items
7. The median of a sample will always equal the
Options
mode
mean
50th percentile
all of the above answers are correct
8. The difference between the largest and the smallest
data values is the
Options
variance
interquartile range
range
coefficient of variation
9. The most frequently occurring value of a data set is
called the
Options
range
mode
mean
median
10. The heights (in inches) of 25 individuals were
recorded and the following statistics were calculated
Mean = 70 range = 20
Mode = 73 variance = 784
Median = 74
The coefficient of variation equals
Options
11.2%
1120%
0.4%
40%
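For question 10, the coefficient of variation follows directly from the given statistics (sd = √variance):

```python
import math

mean = 70
variance = 784
sd = math.sqrt(variance)      # 28.0

# Coefficient of variation = (sd / mean) * 100
cv = sd / mean * 100
print(round(cv, 1))   # 40.0
```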
11. The standard deviation of a sample of 100
observations equals 64. The variance of the sample
equals
Options
8
10
6400
4,096
12. Which of the following is not a measure of dispersion?
Options
the range
the 50th percentile
the standard deviation
the interquartile range
13. The coefficient of variation is
Options
the same as the variance
the standard deviation divided by the mean times 100
the square of the standard deviation
the mean divided by the standard deviation
14. A numerical measure of linear association between
two variables is the
Options
variance
coefficient of variation
correlation coefficient
standard deviation
15. The coefficient of correlation ranges between
Options
0 and 1
-1 and +1
minus infinity and plus infinity
1 and 100
16. The model developed from sample data that has the
form of
Options
regression equation
correlation equation
estimated regression equation
regression model
17. A regression analysis between sales (Y in $1000) and
advertising (X in dollars) resulted in the following
equation
The above equation implies that an
Options
increase of $4 in advertising is associated with an increase of
$4,000 in sales
increase of $1 in advertising is associated with an increase of $4 in
sales
increase of $1 in advertising is associated with an increase of
$34,000 in sales
increase of $1 in advertising is associated with an increase of
$4,000 in sales
18. Regression analysis is a statistical procedure for
developing a mathematical equation that describes how
Options
one independent and one or more dependent variables are related
several independent and several dependent variables are related
one dependent and one or more independent variables are related
None of these alternatives is correct
19.
Options
increase by 120 units
increase by 100 units
increase by 20 units
decrease by 20 units
20. Which of the following is not present in a time series?
Options
seasonality
operational variations
trend
cycles
21. The trend component is easy to identify by using
Options
moving averages
exponential smoothing
regression analysis
the Delphi approach
22. The forecasting method that is appropriate when the
time series has no significant trend, cyclical, or seasonal
effect is
Options
moving averages
mean squared error
mean average deviation
qualitative forecasting methods
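As a quick illustration of the moving-average method named in question 22 (the demand series here is hypothetical):

```python
def moving_average(series, window):
    """k-period simple moving average: each smoothed value is the mean
    of the `window` most recent observations."""
    return [sum(series[i - window:i]) / window
            for i in range(window, len(series) + 1)]

# Hypothetical stable demand series (no trend, cycle, or seasonality)
demand = [20, 22, 19, 21, 20, 23]
print(moving_average(demand, 3))
```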
23. If P (A) = 0.4, P (B | A) = 0.35, P (A ∪ B) = 0.69, then P
(B) =
Options
0.14
0.43
0.75
0.59
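The answer to question 23 follows from the multiplication and addition rules; a quick check:

```python
p_a = 0.4
p_b_given_a = 0.35
p_a_or_b = 0.69

# Multiplication rule: P(A ∩ B) = P(A) * P(B | A)
p_a_and_b = p_a * p_b_given_a
# Addition rule: P(A ∪ B) = P(A) + P(B) - P(A ∩ B), solved for P(B)
p_b = p_a_or_b - p_a + p_a_and_b
print(round(p_b, 2))   # 0.43
```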
24. If P (A) = 0.5 and P (B) = 0.5, then P (A ∩ B)
Options
is 0.00
is 1.00
is 0.5
None of these alternatives is correct.
25. A probability distribution showing the probability of x
successes in n trials, where the probability of success
does not change from trial to trial, is termed a
Options
uniform probability distribution
binomial probability distribution
hypergeometric probability distribution
normal probability distribution
26. Four percent of the customers of a mortgage company
default on their payments. A sample of five customers is
selected. What is the probability that exactly two
customers in the sample will default on their payments?
Options
0.2592
0.0142
0.9588
0.7408
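Question 26 is a binomial probability with n = 5, p = 0.04, k = 2; a quick check:

```python
from math import comb

n, p, k = 5, 0.04, 2
# Binomial probability: P(X = k) = C(n, k) * p**k * (1 - p)**(n - k)
prob = comb(n, k) * p**k * (1 - p)**(n - k)
print(round(prob, 4))   # 0.0142
```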
27. The centre of a normal curve is
Options
always equal to zero
the mean of the distribution
never negative
the standard deviation
28.
Options
0.9091
0.4812
0.4279
0.0533
29. A tabular representation of the payoffs for a decision
problem is a
Options
decision tree
payoff table
matrix
sequential matrix
30. A decision criterion which weights the payoff for each
decision by its probability of occurrence is known as the
Options
Payoff criterion
expected value criterion
probability
expected value of perfect information
31. The expected opportunity loss of the best decision
alternative is the
Options
expected monetary value
payoff
expected value of perfect information
None of these alternatives is correct.
32. An assumption made about the value of a population
parameter is called a
Options
hypothesis
conclusion
confidence
significance
33. The level of significance is the
Options
maximum allowable probability of Type II error
maximum allowable probability of Type I error
same as the confidence coefficient
same as the p-value
34. The level of significance in hypothesis testing is the
probability of
Options
accepting a true null hypothesis
accepting a false null hypothesis
rejecting a true null hypothesis
None of these alternatives is correct.
35. A soft drink filling machine, when in perfect
adjustment, fills the bottles with 12 ounces of soft drink.
Any overfilling or underfilling results in the shutdown
and readjustment of the machine. To determine whether
or not the machine is properly adjusted, the correct set of
hypotheses is
Ans: H0: μ = 12    Ha: μ ≠ 12
36. Whenever all the constraints in a linear program are
expressed as equalities, the linear program is said to be
written in
Options
Bounded form.
Feasible form.
Standard form.
Alternative form.
37. If an artificial variable is present in the ‘basic
variable’ column of the simplex table, then the solution is
Options
degenerate
infeasible
unbounded
none of the above
38. The solution to a transportation problem with m rows
and n columns is feasible if the number of positive
allocations is
Options
mxn
m+n
m+n-1
m+n+1
39.
Ans: exist for each variable in a linear programming problem
40. To find the optimal solution to a linear programming
problem using the graphical method
Options
Find the feasible point that is at the highest location.
Find the feasible point that is closest to the origin.
Find the feasible point that is the farthest away from the origin.
None of the alternatives is correct.