Quantitative Article

 


Article Reviews (Link for submission provided below)

  • Two reviews: one quantitative and one qualitative research article
  • Topic: your choice, but a topic in the same family as your expected dissertation topic is recommended so you can explore what is out there
  • No hard word count or page requirement, but a rubric will be provided
  • Safe Assign will be used to track and monitor submissions for plagiarism
  • Submissions with a Safe Assign match of more than 25% will not be accepted
  • An example of a quantitative and a qualitative article will be posted on the iLearn course page soon.

Please use APA formatting and include the following information:

  • Introduction/Background: Provide context for the research article. What led the author(s) to write the piece? What key concepts were explored? Were there weaknesses in prior research that led the author to the current hypothesis or research question?
  • Methodology: Describe how the data were gathered and analyzed. What research questions or hypotheses were the researchers trying to explore? What statistical analysis was used?
  • Study Findings and Results: What were the major findings from the study? Were there any limitations?
  • Conclusions: Evaluate the article in terms of significance, research methods, readability, and the implications of the results. Does the piece lead into further study? Are there different methods you would have chosen based on what you read? What are the strengths and weaknesses of the article in terms of statistical analysis and application? (This is where a large part of the rubric is covered.)
  • References 

JOURNAL OF EDUCATORS ONLINE

TEACHING QUANTITATIVE COURSES ONLINE:
ARE LEARNING TOOLS OFFERED BY PUBLISHERS EFFECTIVE?
Mohammad Ahmadi, University of Tennessee-Chattanooga
Parthasarati Dileepan, University of Tennessee-Chattanooga
Kathleen Wheatley, University of Tennessee-Chattanooga

ABSTRACT

In recent years, online teaching has become extremely popular. Most institutions of higher learning offer online courses in almost every field of study. Teaching any course online is challenging, but teaching quantitative courses, such as operations management, management science, and statistics, adds a further dimension of difficulty. Publishers have been assisting professors of quantitative methods courses by developing various teaching and evaluation tools. This study explores one such tool, the Quiz Me Mastery Points feature of Pearson's MyOmLab. The performance of students on their examinations was compared with the Mastery Points they earned through the Quiz Me feature, and a significant correlation was found between the two.

Keywords: Online teaching, Quantitative courses, Quiz Me Mastery Points, MyOmLab

INTRODUCTION
In the last decade, online teaching and learning have become the norm in many institutions of higher
learning. Numerous institutions are offering online
courses both nationally and internationally. The
Online Consortium tracks online education in the
Unites States and releases an annual report entitled
The Online Report Card. The most recently
released report (Allen & Seaman, 2016) showed
there were more than 5.8 million students in the
United States enrolled in one or more online courses
in the fall of 2014. This constitutes 28.4% of all
student enrollment. The report further stated that
many academic leaders (63.31% in 2015) strongly
believe online learning is a critical component of
their long-term strategy. It also stated that 77.14%
of the chief academic officers in 2015 rated the
learning outcome of online education as good as
or better than face-to-face. However, an alarming
finding was that only 29.1% of the chief academic
officers believed their faculty accepted the value

and legitimacy of online education. These findings,
along with historic trends, reveal a mismatch
between the growth in student demand for online
course offerings and the hesitancy of faculty to buy
into the efficacy of online teaching. Reconciling
this mismatch is critical to realizing the full
potential of the online classes that students are increasingly expecting.

Data were collected from students in an online
MBA program (Kim, Liu, & Bonk, 2005) through
semistructured, one-on-one interviews, surveys,
and in-person focus group interviews. It was determined that over 70% of those surveyed described
their online learning experience in a positive
manner, and about 93% of the respondents were
satisfied with the quality of their online courses. A
study that conducted one-on-one interviews with
fifteen experienced e-learning instructors (Bailey
& Card, 2009) identified eight pedagogical practices for effective online teaching: fostering relationships, engagement, timeliness, communication, organization, technology, flexibility, and
high expectations. The challenge of understanding
and integrating these eight facets of effective online
teaching was a possible reason for the hesitancy
within the ranks of the faculty to embrace online
teaching (Allen & Seaman, 2016).

Two key obstacles for effectively teaching an
online class were identified as meeting students'
core educational needs and maintaining a sense of
teaching presence (Carliner & Shank, 2016). To
meet students’ core needs, instructors must draw
on a variety of tools and strategies, which various
textbook publishers are increasingly offering.
Among them are MyLab by Pearson, MindTap
by Cengage, and Wiley Plus. Effective use of
these tools can bridge the gap between student
expectations and the hesitancy of faculty to meet
the core needs of students.

This paper explores and evaluates the Quiz
Me Mastery Points of Pearson MyOmLab and
determines whether this feature can bridge the gap
between faculty hesitation and student demand for
online offerings. We studied students’ performances
on tests and the Mastery Points they earned
through the Quiz Me feature and found that there
is a significant correlation between the two. First,
we present a comprehensive review of the current
literature that deals with various challenges faced
by online course offerings and what pedagogical
responses were likely to be successful. Then, in the methodology section of the study, we investigate the performance of 174 students over four semesters (3,000 individual assessment scores). Next, we give the results of the analysis and identify factors that do or do not have an impact upon student performance. Finally, we propose possible avenues
for future research.
LITERATURE REVIEW

In recent years, blended teaching and learning, which combines online and face-to-face instruction, has grown immensely; yet the literature is not as abundant as one would expect. Beyond learning outcomes themselves, studies have examined students' and teachers' viewpoints such as satisfaction, performance, professor-student interaction, and a host of other facets of teaching and learning. Smith and Bryant (2009) observed the paucity of literature on teaching case-based statistics classes and offered useful tips for guiding online discussions. Dotterweich
and Rochelle (2012) also lamented the paucity
of research examining student characteristics
and factors leading to successful outcomes.
They studied three modes of delivery (online,
instructional television, and traditional classroom)
with three groups of students with similar GPAs
prior to taking their statistics courses. They found online students were significantly older, more likely to repeat the course, and had earned more credit hours prior to enrolling. They also found
that GPA and percentage of absences were highly
significant predictors of course performance. On
the suitability of online delivery for quantitative
business courses, specifically business statistics
and management science, research findings
suggest that features involving professor-student
interaction are the most useful, features promoting
student-student interaction are the least useful, and
discussion forums are of limited value in learning
quantitative content (Sebastianelli & Tamimi,
2011). Katz and Yablon (2003) examined students’
academic performance in a required first-year
university internet-based Introduction to Statistics
course and the psychopedagogical variables
that contributed to students’ online learning
as compared to the learning of students who
participated in a traditional lecture-based course.
They found no difference in the performance
levels achieved by students of the two groups.
In addition, they found that participation in the online course improved psychopedagogical attitudes toward online learning despite the participants' initial misgivings. A meta-analysis of performance differences between online and face-to-face undergraduate economics courses in the United States (Sohn & Romal, 2015) found
statistically significant and stronger performances
for face-to-face instruction. Further, the study
found older/mature online instruction enrollees
performed better. Concerning satisfaction, a survey
of students of an online statistics course found
positive satisfaction with a mean of 4.00 on a five-point Likert scale (Al-Asfour, 2012). The study demonstrated that students were satisfied with online instruction, communication, and assessment.

On the question of students’ perceptions of
online homework assignments, a study of an
introductory finance class discovered that, in
general, students preferred online homework to traditional homework. The study further
determined that students found that the homework
assignments increased their understanding of the
material and graduate students reported a higher
level of satisfaction than did undergraduates
(Smolira, 2008). Law, Sek, Ng, Goh, & Tay (2012)
examined students’ perceptions of the use of Pearson’s online learning platform MyMathLab as a supplementary tool for conducting assignments and assessments in a mathematics course and found
that overall the students were satisfied with the use
of the MyMathLab platform.

Alrushiedat and Olfman (2013) conducted a
field experiment that explored the potential benefits
of asynchronous online discussions for business
statistics classes and found they facilitated more
and better-quality participation and engagement
for undergraduates.

Walstrom (2014) compared the performance
and satisfaction of over 220 students enrolled
in a traditional face-to-face class and over 300
students in an online class while migrating an
Electronic Business Management course from
a traditional face-to-face delivery to an online
delivery across a six-and-a-half-year period. The
comparison revealed that student performance
and satisfaction remained mostly consistent across
delivery methods.

Nicholson and Nicholson (2010) surveyed
student and faculty perceptions of using streaming
video for teaching students Microsoft Excel and
Access skills in an introductory management
information systems course. The results from
the survey showed that the use of a multimedia
component to convey course material provided
benefits to students in the form of greater satisfaction
with the learning process, a greater understanding
of the material, as well as a reduction in the effort
required to complete homework assignments. They
further reported that the instructors experienced
a marked reduction in visits from students who
required additional exposure to previously
covered material, a decrease in prep time during
subsequent semesters, and seamless portability to
online learning contexts.

Fuller and Bail (2011), using an action research
model, described the outcomes of an interactive
team-teaching model while teaching an online
graduate-level disaster research and statistics
course during a span of five semesters. They

reviewed instructor reflective logs and student
responses to the team-teaching model and found
that there was a positive benefit in developing
synergy in content and pedagogies, continued
instructor learning, and continuous reflection on
instructional design. They further found that the
immediacy of feedback and the added access and
clarity of the team-teaching process resulted in
students reporting a greater understanding of the
research and statistical process.

Hegeman (2015) examined whether student
performance in an online College Algebra course
could be improved if instructor-generated video
lectures were used instead of publisher-generated
educational resources. The study involved a College
Algebra course that used all the publisher-generated
educational resources and another course in which
students completed instructor-generated guided
note-taking sheets while watching instructor-
generated video lectures with publisher-generated
learning aids available as supplemental resources.
The results of this study showed that strategically
placing instructor-generated content improved
student performance significantly on both online
and handwritten assessments. The effectiveness
of the videoconferencing software Blackboard
Collaborate for carrying out instruction at
the college level to students attending classes
synchronously at multiple locations was evaluated
by Tonsmann (2014) and found to be an effective
method for educating students at a distance.

A multiple regression analysis used a dataset
that included over 5,000 courses taught by over
100 faculty members over a period of ten academic
terms at a large, public, four-year university
(Cavanaugh & Jacquemin, 2015). This study
revealed a statistical difference among course
formats that amounted to a negligible difference
of less than 0.07 GPA points on a four-point scale.
The authors further found an interaction between
course type and student GPA, indicating that
students with higher GPAs performed even better
in online courses. Alternatively, struggling students
performed worse when taking courses in an online
format compared to a face-to-face format.

Pena-Sanchez (2009) examined whether the
course delivery method, online or face-to-face,
and gender affected academic progress. Through
chi-square tests, it was found that the population
proportion of successful students in a course of Business Statistics did not depend on their gender
or the delivery mode of the class.

Wiechowski and Washburn (2014) studied more
than 3,000 end-of-semester course evaluations
collected from 171 finance and economics courses
in the 2010-2011 academic year. They reported
that the online and blended courses had a stronger
relationship with high course satisfaction than did
face-to-face courses. Further, they stated that there
was no significant relationship found among student
learning outcomes and the mode of course delivery.

Peng (2015) used an ordinary least squares
regression model to analyze a sample of 206 students
during the period from 2008 to 2012 and found that
significant predictors of student performance were
age, major, degree obtained, and the number of
hours a student worked but not the choice of a more
readable textbook.

Calafiore and Damianov (2011) used the online
tracking feature in Blackboard (Campus Edition)
to retrieve the real time that each student spent in
the course for the entire semester and to analyze the
impact of time spent online, prior grade point average
(GPA), and other demographic characteristics of
students on their final grades. They found that both
time and GPA were significant determinants of the
final grade.

Chen, Jones, and Moreland (2010) surveyed
students in online and traditional classroom sections
of an intermediate-level cost accounting course on
several items related to instruction and learning
outcomes. Then, they compared the student
examination performance in the two types of
sections. They found that both learning environments
generally had similar ratings. However, where there
was a difference, the satisfaction level of students in
the traditional classroom was higher. Furthermore,
they stated that the examination performance for 14
of 18 topic areas were similar with the traditional
method producing better comprehension in three of
the remaining four areas.
METHODOLOGY

The opportunities opened up by the increasing popularity of online courses come with difficult challenges. They include technical challenges such as mastering software platforms for content delivery, interacting with students, online content delivery, participation, assessment, learning styles, time management, and motivation.

There are technical solutions for many of these challenges, and publishers offer learning platforms for popular textbooks.

Quantitative courses present tough challenges
when they are offered online. Mastering
quantitative aspects of problem solving is critical.
Publisher online platforms have modules that
provide the opportunity for students to practice
and master concepts before taking tests. Pearson’s
MyOmLab platform includes several tools that can
be used for practice and learning concepts as well
as assessments. They include Practice, QuizMe,
Homework, Quiz, and Test.

As students work on each section of the
chapters of the textbook and achieve a minimum
score in a combination of assessment tools set by
the instructor, the students earn a Mastery Point.
In this study, three tools were used: Practice,
QuizMe, and Chapter tests. Students can learn
concepts and problem-solving skills by using the
practice tool, which allows students to seek help
from a variety of sources including reaching out
to the instructor. The QuizMe tool allows students
to self-test at the level of mastery achieved by
using the practice tool. In this study, we set the
minimum threshold of 80% in the QuizMe for
students to earn the Mastery Points associated
with the section. If a student failed to achieve
the minimum score, she or he could go back to
Practice and then retake the QuizMe until earning
the Mastery Point. Inasmuch as students could seek help while using Practice and repeat QuizMe an unlimited number of times, Mastery Points earned had half the weight of chapter tests; the tests were similar in content, but students could not receive any help and had only two attempts, with the higher of the two grades recorded.
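To make the weighting concrete, the sketch below encodes the rules just described: a Mastery Point is earned once any QuizMe attempt reaches the 80% threshold, and the chapter test records the better of two attempts at twice the weight of the mastery component. This is our own illustration, not code from the article or from MyOmLab; the function and variable names are hypothetical.

```python
# Hypothetical sketch of the grading rules described above; names are ours,
# not from the article or from MyOmLab.

MASTERY_THRESHOLD = 80.0  # minimum QuizMe score (%) required to earn the Mastery Point
MASTERY_WEIGHT = 1.0      # Mastery Points carried half the weight of chapter tests
TEST_WEIGHT = 2.0

def earned_mastery_point(quizme_scores):
    """True if any QuizMe attempt (unlimited retakes, help allowed in Practice)
    reached the 80% threshold."""
    return any(score >= MASTERY_THRESHOLD for score in quizme_scores)

def chapter_test_score(attempts):
    """Chapter tests allowed at most two attempts with no help; the higher
    of the two grades was recorded."""
    return max(attempts[:2])

# Example: a student passes QuizMe on the third try and improves on the test retake.
mastery = 100.0 if earned_mastery_point([62.0, 75.5, 86.0]) else 0.0
test = chapter_test_score([71.0, 84.0])
weighted = (MASTERY_WEIGHT * mastery + TEST_WEIGHT * test) / (MASTERY_WEIGHT + TEST_WEIGHT)
print(f"mastery={mastery}, test={test}, weighted chapter score={weighted:.1f}")
```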

One of the research questions we faced was
whether this process of earning Mastery Points
with unlimited trials of Practice and QuizMe
was helping student performance as measured by
chapter tests. Further, we had both undergraduate
and graduate classes in the pool of classes for
which we gathered data (further described in
the next section). Therefore, we formulated the
following four hypotheses:
Hypothesis 1:

H0: The Mastery Score in a given chapter
does not have any effect on the test score in the
corresponding chapter.

JOURNAL OF EDUCATORS ONLINE

HA: The higher the Mastery Score in a given chapter, the higher the test score will be in the corresponding chapter.
Hypothesis 2:

H0: The time spent earning the Mastery Score in a given chapter does not have any effect on the test score in the corresponding chapter.

HA: The more time spent earning the Mastery Score in a given chapter, the higher the test score earned in the corresponding chapter.
Hypothesis 3:

H0: The average chapter test scores for graduate
students are the same as the corresponding average
for undergraduate students.

HA: The average chapter test scores for graduate
students are higher than the corresponding average
for undergraduate students.
Hypothesis 4:

H0: There is no interaction effect between
course level and Mastery Score earned on the
average chapter test scores.

HA: There is an interaction effect between
course level and Mastery Score earned on the
average chapter test scores.
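These hypotheses correspond to a linear model with three main effects and one interaction term. Below is a minimal sketch of how such a model could be specified, assuming a long-format table with one row per student per chapter; the column names, the CSV file, and the use of Python's statsmodels are our assumptions, since the article does not state which software produced its results.

```python
# Minimal sketch, not the authors' actual code: regress chapter test scores on
# Mastery Score, Mastery Time, course level, and a level-by-mastery interaction.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per student per chapter.
df = pd.read_csv("assessments.csv")  # assumed columns: test_score, mastery_score,
                                     # mastery_time, course_level ("G" or "U")
df["graduate"] = (df["course_level"] == "G").astype(int)  # 1 = Graduate, 0 = Undergraduate

model = smf.ols(
    "test_score ~ mastery_score + mastery_time + graduate + graduate:mastery_score",
    data=df,
).fit()
print(model.summary())  # per-coefficient estimates and t-tests, as in Tables 4 and 5
```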
THE DATA

We chose Operations Management at the
undergraduate level and Production and Operations
Management at the graduate level. While there were
significant differences in the range and coverage
of topics between the undergraduate and graduate
classes, we identified nine core chapters that were
common to both levels of classes. They are given
in Table 1.

Table 1. Chapters Common to OM and POM

Chapter  Description                      Mastery Points
1        Productivity                     10
2        Project Management               10
3        Forecasting                      7
4        Managing Quality                 6
5        Statistical Process Control      3
6        Inventory Management             7
7        Aggregate Planning               7
8        Materials Requirement Planning   8
9        Scheduling                       7

Our study included 174 students over a period
of four semesters. For each of the 174 students,
data were collected on five variables for each of the
nine chapters listed in Table 1. These variables are
shown in Table 2. Note the Mastery Score recorded
was the percentage of total mastery points available
for the given chapter. Similarly, the test scores were
converted to a 100-point scale for consistency.
Table 2. Variables for the Nine Chapters

Variable       Description
Course level   Graduate or undergraduate
Chapter        Assessment chapter
Mastery Score  Percentage of subsections of the chapter mastered
Mastery Time   Time spent mastering the chapter
Test Score     Test score (0–100)
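As an illustration of this conversion (our own sketch; the raw columns and test maximum are hypothetical), a student who earned 6 of the 7 Mastery Points available for the Forecasting chapter would have a Mastery Score of 6/7 ≈ 85.7:

```python
# Sketch of the normalization described above; the raw columns are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "chapter": ["Forecasting"],
    "points_earned": [6],      # Mastery Points the student earned
    "points_available": [7],   # total available for the chapter (Table 1)
    "raw_test": [42],          # hypothetical raw test marks
    "test_max": [50],
})
df["mastery_score"] = 100 * df["points_earned"] / df["points_available"]  # -> 85.7
df["test_score"] = 100 * df["raw_test"] / df["test_max"]                  # -> 84.0
print(df[["chapter", "mastery_score", "test_score"]])
```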

THE RESULTS
The summary of results is presented in Table 3. Figure 1 shows a scatter plot of the average chapter
Mastery Score of individual students against their
respective average test score. The graduate student
scores are plotted with ● and the undergraduate
student scores are plotted with *. The scatter plot
shows a positive relationship between the level of
mastery achieved and test score. Further, there is
a clear separation of average scores between the
graduate and undergraduate students.
Figure 1. Average scores of individual students

Table 3. Average Mastery and Test Scores

Chapter                         Graduate  Undergraduate
Productivity                    98.46     94.44
Project Management              96.27     90.80
Forecasting                     91.89     93.13
Managing Quality                98.83     93.64
Statistical Process Control     85.45     87.42
Inventory Management            90.56     81.49
Aggregate Planning              92.98     92.87
Materials Requirement Planning  94.08     87.17

Figure 2 shows the chapterwise averages of all the students' scores as a scatter plot. The positive relationship between the level of mastery achieved and the test score, as well as the separation between graduate and undergraduate students, is evident in this plot as well.

Figure 2. Chapterwise average score
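A scatter plot in the style of Figures 1 and 2 could be reproduced roughly as follows. This is a sketch under the same data-layout assumptions as the earlier sketches; the matplotlib markers stand in for the article's ● and *.

```python
# Sketch in the style of Figures 1 and 2; data layout assumed as in the sketch above.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("assessments.csv")  # assumed columns: mastery_score, test_score, course_level

for level, marker, label in [("G", "o", "Graduate"), ("U", "*", "Undergraduate")]:
    sub = df[df["course_level"] == level]
    plt.scatter(sub["mastery_score"], sub["test_score"], marker=marker, label=label)

plt.xlabel("Mastery Score (%)")
plt.ylabel("Test Score (0-100)")
plt.legend()
plt.show()
```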

Finally, we ran a regression with individual
chapterwise test scores as the dependent variable
and Course level, Mastery Score, and Mastery
Time as the three independent variables. The
overall regression results are shown in Table 4, and
the results of individual and interaction effects are
shown in Table 5.

Table 4. Results of Overall Regression

Analysis of Variance

Source           DF    Sum of Squares  Mean Square  F Value  Pr > F
Model            4     129783          32446        174.81   <.0001
Error            1427  264854          185.60
Corrected Total  1431  394637

Table 5. Results of Individual and Interaction Effects

Variable                          DF  Estimate  Std. Error  t Value  Pr > |t|
Intercept                         1   42.8999   1.75552     24.44    <.0001
Mastery Score                     1   0.4456    0.02043     21.33    <.0001
Time spent earning Mastery Score  1   0.0021    0.00236     0.90     0.3687
Course level (Graduate)           1   25.8375   4.39389     5.88     <.0001

The results in Table 4 show that the overall regression, with the chapter test scores as the dependent variable and the three main effect variables plus one interaction effect variable as predictors, is significant (F = 32446/185.60 ≈ 174.8, p < .0001). Table 5 shows some interesting results; based on these results, three of the four null hypotheses stated earlier were rejected and one was not. These hypothesis test conclusions are discussed below.

Hypothesis 1: Since the p-value for the
main effect Mastery Score earned was less than
0.0001, the null hypothesis was rejected, and
we concluded the Mastery Score earned was a
significant predictor of chapter test scores earned.
The estimated regression coefficient of 0.4456 indicated an increase of almost half a point in the chapter test score for every additional percentage point of Mastery Score earned.
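Combining the estimates in Table 5 with the interaction coefficient of -0.2092 reported under Hypothesis 4 below, predicted chapter test scores can be computed directly. The following sketch is our own illustration (the time units are simply whatever MyOmLab recorded); it shows the roughly half-point gain per Mastery Score point for undergraduates:

```python
# Predicted chapter test score built from the reported estimates (Table 5, plus
# the interaction coefficient of -0.2092 discussed under Hypothesis 4).
def predicted_test_score(mastery, time_online, graduate):
    return (42.8999
            + 0.4456 * mastery              # Mastery Score (%)
            + 0.0021 * time_online          # not significant (p = 0.3687)
            + 25.8375 * graduate            # 1 = Graduate, 0 = Undergraduate
            - 0.2092 * graduate * mastery)  # level-by-mastery interaction

# An undergraduate moving from 80% to 100% mastery gains about 0.4456 * 20 ≈ 8.9 points;
print(predicted_test_score(100, 120, 0) - predicted_test_score(80, 120, 0))
# a graduate gains only (0.4456 - 0.2092) * 20 ≈ 4.7 points over the same range.
print(predicted_test_score(100, 120, 1) - predicted_test_score(80, 120, 1))
```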

Hypothesis 2: Since the p-value for the main
effect time spent earning the Mastery Scores
was 0.3687, the null hypothesis was not rejected.
Therefore, we concluded there was no evidence
found in the data that the time spent earning
the Mastery Scores was a significant factor in
predicting the earned chapter test score.

Hypothesis 3: Since the p-value for the main
effect Course Level was less than 0.0001, the
null hypothesis was rejected, and we concluded
that the course level was a significant predictor
of the earned chapter test score. The estimated
regression coefficient of 25.8375 indicated that
the graduate students on average were expected

to score a whopping 25 points more than the
undergraduate students.

Hypothesis 4: Since the p-value for the
interaction effect between Course Level Interaction
and Mastery Score earned was less than 0.0001, the
null hypothesis was rejected; and we concluded the
interaction between the course level and Mastery
Score earned was a significant predictor of chapter
test score earned. The estimated regression coefficient of -0.2092 revealed, interestingly, that the undergraduate students benefited more from earning more Mastery Scores than the graduate students: the effective Mastery Score slope was 0.4456 for undergraduates but only 0.4456 - 0.2092 = 0.2364 for graduates.
DISCUSSION

The result of the first hypothesis was expected.
The process of earning Mastery Scores resulted
in students spending more time with practice
questions and, as a result, the students achieved
a better understanding of concepts tested in the
chapter tests, and, therefore, scored higher. This
result reveals a potential predictor of performance
in addition to GPA, as found by Dotterweich and
Rochelle (2012).

The result of the second hypothesis, namely
the time spent earning the Mastery Scores, was
not significant and may seem counter intuitive.
However, the time recorded by the MyOMLab
system is the duration of time the students were
connected to the online tool, which may not be the
same as the time actually spent working on earning
Mastery Scores. It may have included idle time
when students took a break or failed to log off after
completing the task. Therefore, it was very likely that the time recorded by the system was not an accurate measure of the time students actually spent
working on earning the Mastery Scores, and this
inaccuracy may have contributed to the conclusion
that it was not a significant factor in estimating
chapter test score earned. In any case, even if the
length of time spent was not a significant factor
in determining the chapter test score earned, the
conclusion of Hypothesis 1 showed that earning
Mastery Score was a significant factor.

While the conclusion of Hypothesis 3, which
showed that graduate students scored higher on
chapter tests than undergraduate students, was
not surprising, the magnitude of the difference
was. The level of commitment and dedication of
a graduate student were likely reasons for this.

Another possible reason for this disparity was that the online format for quantitative courses was more suitable for graduate students than undergraduate students. Further research is needed to confirm this.

The conclusion of Hypothesis 4 revealed an
intriguing insight. The regression slope coefficient
for the interaction term between course level (1
= Graduate, 0 = Undergraduate) and Mastery
Score earned showed that the higher the Mastery
Score the undergraduates had, the higher their test
scores were relative to the graduate students (all
other factors held constant). If one of the reasons we speculated for the conclusion of Hypothesis 3, namely, that quantitative courses in an online format were less suitable for undergraduate students than graduate students, was true, then one possible mitigation strategy would be to encourage students to earn more Mastery Scores. This may be achieved by setting a higher standard for earning the Mastery Score than the 80% we used or by giving Mastery Scores a higher weighting in the final grade.
CONCLUSION

Over the last decade, online course delivery has seen remarkable growth. Studies show that the
demand for online offerings will continue to grow.
While the flexibility they offer is very attractive
to students, instructors are hesitant to embrace
them fully. One of the factors for this hesitancy
is the uncertainty with respect to which strategies
and tools instructors can use that will benefit
students and help them to succeed in mastering
course objectives. This study demonstrated the
practicality of tools such as Practice and Quiz Me
of the Pearson MyLab platform. In addition, such
publisher-developed online tools can help bridge
the gap in performance between online and face-
to-face undergraduate economics courses in the
United States that Sohn and Romal (2015) found.
We found that the time spent earning the Mastery
Scores was not a significant factor for students to
improve their chapter test scores; however, it was
very likely that MyOmLab overstated the time
spent by students earning the Mastery Scores. The
time reported by MyOmLab was the duration of
time a student was connected to the system, which
may have been longer than the actual time spent
working because students may take breaks or, in
some instances, students may not terminate the
session after completing their work. Therefore, we cannot be certain that time spent was not a
significant factor. Finally, in this study, the students
were not required to earn the Mastery Score before
attempting the corresponding chapter test, though most students did. Whether requiring students
to earn the Mastery Score before attempting the
corresponding chapter test will have a significant
effect on student performance in chapter tests is a
topic for future research.

REFERENCES
Al-Asfour, A. (2012). Examining student satisfaction of online statistics courses. Journal of College Teaching & Learning (Online), 9(1), 33–38.

Allen, I. E., & Seaman, J. (2016). Online report card: Tracking online education in the United States. Babson Park, MA: Babson Survey Research Group.

Alrushiedat, N., & Olfman, L. (2013). Aiding participation and
engagement in a blended learning environment. Journal of
Information Systems Education, 24(2), 133–145.

Bailey, C. J., & Card, K. A. (2009). Effective pedagogical practices
for online teaching: Perception of experienced instructors.
The Internet and Higher Education, 12(3-4), 152–155.
doi:10.1016/j.iheduc.2009.08.002

Calafiore, P., & Damianov, D. S. (2011). The effect of time spent
online on student achievement in online economics and
finance courses. The Journal of Economic Education, 42(3),
209–223. doi:10.1080/00220485.2011.581934

Carliner, S., & Shank, P. (Eds.). (2016). The e-learning handbook: Past promises, present challenges. New York, NY: John Wiley & Sons.

Cavanaugh, J. K., & Jacquemin, S. J. (2015). A large sample
comparison of grade based student learning outcomes in
online vs. face-to-face courses. Online Learning, 19(2), n2.

Chen, C. C., Jones, K. T., & Moreland, K. (2010). Distance
education in a cost accounting course: Instruction, interaction,
and multiple measures of learning outcomes. Journal of
Educators Online, 7(2). doi:10.9743/JEO.2010.2.3

Dotterweich, D. P., & Rochelle, C. F. (2012). Online, instructional
television and traditional delivery: Student characteristics and
success factors in business statistics. American Journal of
Business Education, 5(2), 129–138.

Fuller, R. G., & Bail, J. (2011). Team teaching in the online graduate
environment: Collaborative instruction. International Journal
of Information and Communication Technology Education
(IJICTE), 7(4), 72–83. doi:10.4018/jicte.2011100107

Hegeman, J. S. (2015). Using instructor-generated video lectures
in online mathematics courses improves student learning.
Online Learning, 19(3), 70–87.

Katz, Y. J., & Yablon, Y. B. (2003). Online university
learning: cognitive and affective perspectives.
Campus-Wide Information Systems, 20(2), 48–54.
doi:10.1108/10650740310467745

Kim, K. J., Liu, S., & Bonk, C. J. (2005). Online MBA students’
perceptions of online learning: Benefits, challenges, and
suggestions. The Internet and Higher Education, 8(4),
335–344. doi:10.1016/j.iheduc.2005.09.005

Law, C. Y., Sek, Y. W., Ng, L. N., Goh, W. W., & Tay, C. L. (2012).
Students’ perceptions of MyMathLab as an online learning
tool. International Journal of e-Education, e-Management,
and e-Learning, 2(1), 22–27.

Nicholson, J., & Nicholson, D. B. (2010). A stream runs through
IT: Using streaming video to teach information technology.
Campus-Wide Information Systems, 27(1), 17–24.
doi:10.1108/10650741011011255

Pena-Sanchez, R. (2009). An online course of business statistics:
The proportion of successful students. American Journal of
Business Education, 2(6), 23–30. doi:10.19030/ajbe.v2i6.4084

Peng, C. C. (2015). Textbook readability and student performance
in online introductory corporate finance classes. Journal of
Educators Online, 12(2), 35–49.

Sebastianelli, R., & Tamimi, N. (2011). Business statistics and
management science online: Teaching strategies and
assessment of student learning. Journal of Education for
Business, 86(6), 317–325. doi:10.1080/08832323.2010.525545

Smith, M. A., & Bryant, P. G. (2009). Managing case discussions in
introductory business statistics classes: Practical approaches
for instructors. The American Statistician, 63(4), 348–355.
doi:10.1198/tast.2009.09053

Smolira, J. C. (2008). Student perceptions of online homework
in introductory finance courses. Journal of Education for
Business, 84(2), 90–95. doi:10.3200/JOEB.84.2.90-95

Sohn, K., & Romal, J. B. (2015). Meta-analysis of student
performance in micro and macro economics: Online vs. face-
to-face instruction. Journal of Applied Business & Economics,
17(2), 42–51.

Tonsmann, G. (2014). A study of the effectiveness of Blackboard
Collaborate for conducting synchronous courses at multiple
locations. InSight: A Journal of Scholarly Teaching, 9, 54–63.

Walstrom, K. A. (2014). Lessons learned from migrating to an
online electronic business management course. Journal of
Information Systems Education, 25(2), 137–147.

Wiechowski, L., & Washburn, T. L. (2014). Online finance
and economics courses: A comparative study of course
satisfaction and outcomes across learning models. American
Journal of Business Education, 7(1), 37–48.

© Journal of Educators Online. An Open Access Journal.
