Experimenting and Testing (WEB ANALYTICS)

Please note that points will be deducted for overuse of citations.


Please feel free to explore your own research for eBooks, journals, and articles that best answer the topics and assignments. Do your best to locate resources that are no more than three (3) years old. PLEASE DO NOT USE WIKIPEDIA AS A RESOURCE AND/OR REFERENCE.

Please submit a Word Doc with your answers in 500 words or more, and use APA references in support of your answers.

Some resources are attached below; you may use them for help as well.

Explain in your paper how to build a successful testing program, and demonstrate the importance of understanding user interactions.


Explain the importance of setting goals before you test. Explain how the web analyst should be aware of visitor site usability.

Include why a hypothesis is so important, and explain how testing, goal setting, and the hypothesis work together.

https://support.google.com/analytics/answer/9366791?hl=en&ref_topic=1745207&rd=2&visit_id=637345207336525335-2409299199
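
To make the role of the hypothesis concrete, here is a minimal sketch (in Python) of how an experiment's outcome can be checked against a stated hypothesis. It is illustrative only: the variant labels, visit counts, and conversion counts are invented, and the 0.05 significance level is a common convention rather than a requirement of any particular analytics tool.

import math

def two_sided_p_value(conv_a, visits_a, conv_b, visits_b):
    """Two-proportion z-test: does variant B convert at a different rate than variant A?"""
    p_a, p_b = conv_a / visits_a, conv_b / visits_b
    pooled = (conv_a + conv_b) / (visits_a + visits_b)   # conversion rate assumed under the null hypothesis
    se = math.sqrt(pooled * (1 - pooled) * (1 / visits_a + 1 / visits_b))
    z = (p_b - p_a) / se
    return math.erfc(abs(z) / math.sqrt(2))              # two-sided P(|Z| >= |z|)

# Hypothesis: the redesigned page (variant B) changes the conversion rate. Numbers are invented.
p = two_sided_p_value(conv_a=480, visits_a=10_000, conv_b=545, visits_b=10_000)
print(f"p-value = {p:.4f} -> {'reject' if p < 0.05 else 'retain'} the null hypothesis at the 0.05 level")

Writing the goal and the hypothesis down before the test runs is what makes the printed decision meaningful; without them, the same numbers invite after-the-fact storytelling.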

Agile Analytics, a New Buzzword: What in the World Is Agile Analytics?

How does it apply to Web Analytics?

By Chris Preimesberger, 2013

Agile Analytics: What It Is, and 10 Best Practices for Using It


In the world of web analytics, the visitor and customer data collected can become voluminous, and this data must be stored in a data warehouse for retrieval at some point. Enterprises of all sizes are now realizing they have dormant data in silos that they can put to work for them. For example, this so-called big data, if analyzed correctly, can help project sales spikes, predict raw material needs, and help companies understand their customers better, and this is just a sampling of what it can do. Too often, however, enterprises find themselves ill-equipped to gather, cleanse, and analyze this data, and are therefore unable to act upon potential insights or gain competitive advantages.

Agile big data analytics focuses not on the data itself but on the insight and action that can ultimately be drawn from nimble business intelligence systems. Rather than beginning with investment and platform building, Agile analytics starts with learning and testing, so that companies can build their models and strategies based on solid answers to their most crucial business questions. The sources for this slide show include big data consultancy ThoughtWorks (http://www.thoughtworks.com/big-data-analytics) and eWEEK reporting.

Collaborate Across Business Communities

Data warehousing and business intelligence systems will live in a diversity of environments, not just in the IT department. It’s important to treat business owners, technical experts, project managers, and the many communities of users across the organization as members of the team by allowing them to offer input and test working features as they’re developed.

Educate Stakeholders

Because most stakeholders aren’t well versed in data warehousing and business intelligence, they don’t know what’s reasonable for them to ask or expect, and often they change their minds as they see the system put into action. Investing time in education, both up front and throughout development, will help clarify needs and goals, keeping the developed product useful and relevant.

Continuously Deliver Working Features

In traditional development models, developers could work for months on a feature, only to find it no longer applicable to a changing business environment. In Agile analytics, each iteration should deliver a working feature to be tested by stakeholders and adapted in further iterations to better suit the organization’s needs.

Test Frequently

With so many stakeholders on board, it is crucial to test data warehousing/business intelligence systems frequently throughout the development process. Integrate continuously and test systems in pre-production or demo environments at various benchmarks throughout the project so there are no surprises at the end.
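
As one illustration of what testing frequently can look like on a DW/BI project, here is a minimal sketch of a single automated data check that could run on every integration build against a pre-production copy of the warehouse. The table and column names (web_sessions, session_id) and the warehouse.db file are hypothetical placeholders, and SQLite simply stands in for whatever database platform a team actually uses.

import sqlite3

def check_no_duplicate_sessions(conn):
    """Fail the build if the same session has been loaded into the warehouse twice."""
    dupes = conn.execute(
        "SELECT session_id, COUNT(*) FROM web_sessions "
        "GROUP BY session_id HAVING COUNT(*) > 1"
    ).fetchall()
    assert not dupes, f"duplicate sessions found: {dupes[:5]}"

if __name__ == "__main__":
    conn = sqlite3.connect("warehouse.db")  # pre-production or demo copy, never the live system
    check_no_duplicate_sessions(conn)
    print("data checks passed")

Run at each benchmark, a growing suite of small checks like this is what prevents the end-of-project surprises this practice warns about.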

Adapt to Changing Conditions

The core purpose of big data is to find key insights upon which an organization can pivot. In this way, big data by definition demands agility. Listen to what users, tests, and business conditions are telling you, and work change into subsequent iterations.

Automate as Many Processes as Possible

The greatest manpower should be saved for developing new features and collaborating across organizational and development teams. As such, it’s important to automate as many regular processes as possible, from testing to administrative tasks, so that developers can focus intensely on an iteration’s set goals.
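
A minimal sketch of that kind of automation follows: a small registry of routine checks that a scheduler or continuous-integration job can run unattended, so developers are not spending iteration time on them by hand. The two registered tasks are placeholders for whatever regular processes a team actually has.

import sys
import time

CHECKS = []

def check(fn):
    """Register a routine task so the runner below picks it up automatically."""
    CHECKS.append(fn)
    return fn

@check
def warehouse_row_counts():
    ...  # placeholder: e.g., compare today's load volume against yesterday's

@check
def report_refresh_completed():
    ...  # placeholder: e.g., confirm last night's dashboard extract finished

if __name__ == "__main__":
    failures = 0
    for fn in CHECKS:
        started = time.time()
        try:
            fn()
            print(f"OK   {fn.__name__} ({time.time() - started:.1f}s)")
        except Exception as exc:
            failures += 1
            print(f"FAIL {fn.__name__}: {exc}")
    sys.exit(1 if failures else 0)  # a non-zero exit lets the scheduler or CI server flag the run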

Foster Self-Organized Teams

Hire talented, motivated individuals who can set their own goals for each iteration and function as effective self-managers. Then trust them to do the job at hand, self-monitoring and adapting as they go.

Adapt Agile Methods to Individual Projects and Teams

While Agile analytics has many guidelines, it is a style, not a process. More traditional tactics aren’t antithetical to Agile if they’re effective in achieving iterative goals. Choose the tactics that work best for each project and team rather than adhering to static rules.

Conduct Regular Reviews of Processes

Agile systems development requires just as much discipline and rigor as the traditional waterfall method in order to stay on track. However, rigor should be applied not to adhering to rigid systems and static goals, but to constantly re-evaluating the effectiveness of the methods and styles at hand.

Constantly Learn

Keep up to date with the best data warehousing and business intelligence practices and implement them fluidly into each iterative phase. This will substantially increase the development team’s agility and keep the organization ahead of its competitors.

Praise for Agile Analytics

“This book does a great job of explaining why and how you would imple-
ment Agile Analytics in the real world. Ken has many lessons learned from
actually implementing and refining this approach. Business Intelligence is
definitely an area that can benefit from this type of discipline.”

—Dale Zinkgraf, Sr. Business Intelligence Architect

“One remarkable aspect of Agile Analytics is the breadth of coverage—from
product and backlog management to Agile project management techniques,
from self-organizing teams to evolutionary design practices, from auto-
mated testing to build management and continuous integration. Even if you
are not on an analytics project, Ken’s treatment of this broad range of topics
related to products with a substantial data-oriented flavor will be useful for
and beyond the analytics community.”

— Jim Highsmith, Executive Consultant, ThoughtWorks, Inc., and author of Agile
Project Management

“Agile methods have transformed software development, and now it’s time
to transform the analytics space. Agile Analytics provides the knowledge
needed to make the transformation to Agile methods in delivering your
next analytics projects.”

— Pramod Sadalage, coauthor of Refactoring Databases: Evolutionary Database
Design

“This book captures the fundamental strategies for successful business
intelligence/analytics projects for the coming decade. Ken Collier has raised
the bar for analytics practitioners—are you up to the challenge?”

— Scott Ambler, Chief Methodologist for Agile and Lean, IBM Rational;
Founder, Agile Data Method

“A sweeping presentation of the fundamentals that will empower teams to
deliver high-quality, high-value, working business intelligence systems far
more quickly and cost effectively than traditional software development
methods.”

—Ralph Hughes, author of Agile Data Warehousing


AGILE ANALYTICS

Agile software development centers on four values, which are identified in the Agile Alliance’s Manifesto*:
1. Individuals and interactions over processes and tools
2. Working software over comprehensive documentation
3. Customer collaboration over contract negotiation
4. Responding to change over following a plan

The development of Agile software requires innovation and responsiveness, based on
generating and sharing knowledge within a development team and with the customer.
Agile software developers draw on the strengths of customers, users, and developers
to find just enough process to balance quality and agility.

The books in The Agile Software Development Series focus on sharing the experiences
of such Agile developers. Individual books address individual techniques (such as Use
Cases), group techniques (such as collaborative decision making), and proven solutions
to different problems from a variety of organizational cultures. The result is a core of
Agile best practices that will enrich your experiences and improve your work.

* © 2001, Authors of the Agile Manifesto

Visit informit.com/agileseries for a complete list of available publications.

The Agile Software Development Series
Alistair Cockburn and Jim Highsmith, Series Editors

AGILE ANALYTICS

A VALUE-DRIVEN APPROACH TO BUSINESS
INTELLIGENCE AND DATA WAREHOUSING

KEN COLLIER

Upper Saddle River, NJ • Boston • Indianapolis • San Francisco
New York • Toronto • Montreal • London • Munich • Paris • Madrid
Capetown • Sydney • Tokyo • Singapore • Mexico City

Many of the designations used by manufacturers and sellers to distinguish their products
are claimed as trademarks. Where those designations appear in this book, and the publisher
was aware of a trademark claim, the designations have been printed with initial capital let-
ters or in all capitals.

The author and publisher have taken care in the preparation of this book, but make no
expressed or implied warranty of any kind and assume no responsibility for errors or omis-
sions. No liability is assumed for incidental or consequential damages in connection with or
arising out of the use of the information or programs contained herein.

The publisher offers excellent discounts on this book when ordered in quantity for bulk
purchases or special sales, which may include electronic versions and/or custom covers and
content particular to your business, training goals, marketing focus, and branding interests.
For more information, please contact:

U.S. Corporate and Government Sales
(800) 382-3419
corpsales@pearsontechgroup.com

For sales outside the United States please contact:

International Sales
international@pearson.com

Visit us on the Web: informit.com/aw

Library of Congress Cataloging-in-Publication Data

Collier, Ken, 1960–
Agile analytics : a value-driven approach to business intelligence and
data warehousing / Ken Collier.
p. cm.
Includes bibliographical references and index.
ISBN 978-0-321-50481-4 (pbk. : alk. paper)
1. Business intelligence—Data processing. 2. Business
intelligence—Computer programs. 3. Data warehousing. 4. Agile
software development. 5. Management information systems. I. Title.
HD38.7.C645 2012
658.4’72—dc23
2011019825

Copyright © 2012 Pearson Education, Inc.

All rights reserved. Printed in the United States of America. This publication is protected
by copyright, and permission must be obtained from the publisher prior to any prohibited
reproduction, storage in a retrieval system, or transmission in any form or by any means,
electronic, mechanical, photocopying, recording, or likewise. For information regarding
permissions, write to:

Pearson Education, Inc.
Rights and Contracts Department
501 Boylston Street, Suite 900
Boston, MA 02116
Fax: (617) 671-3447

ISBN-13: 978-0-321-50481-4
ISBN-10: 0-321-50481-X
Text printed in the United States on recycled paper at RR Donnelley in Crawfordsville,
Indiana.
First printing, July 2011

This book is dedicated to my wife and best friend, Beth,
who never once asked, “How come it’s taking you so
long to finish that darn book?”



CONTENTS

Foreword by Jim Highsmith xv
Foreword by Wayne Eckerson xvii
Preface xix
Acknowledgments xxxiii
About the Author xxxv

Part I Agile Analytics: Management Methods 1

Chapter 1 Introducing Agile Analytics 3
Alpine-Style Systems Development 4
What Is Agile Analytics? 7

Here’s What Agile Analytics Is 7
Guiding Principles 9
Myths and Misconceptions 10

Data Warehousing Architectures and Skill Sets 13
Data Warehousing Conceptual Architectures 13
Diverse and Disparate Technical Skills 15

Why Do We Need Agile Analytics? 16
First Truth: Building DW/BI Systems Is Hard 16
Second Truth: DW/BI Development Projects Fail Often 17
Third Truth: It Is Best to Fail Fast and Adapt 18
Is Agile Really Better? 19
The Difficulties of Agile Analytics 20

Introducing FlixBuster Analytics 22
Wrap-Up 23

Chapter 2 Agile Project Management 25
What Is Agile Project Management? 26
Phased-Sequential DW/BI Development 30


Envision Explore Instead of Plan Do 32
Envision Phase 32
Explore Phase 33

Changing the Role of Project Management 35
Making Sense of Agile “Flavors” 36
Tenets of Agility 39

Just Enough Design 39
Synchronize Daily 41
Timebox Everything 42
Colocating Teams 44
Attention to Technical Debt 45
Plan to Capacity and Monitor Velocity 46
Track Daily Progress 49
Monitor Story Completion, Not Task Time 54

Wrap-Up 56

Chapter 3 Community, Customers, and Collaboration 59
What Are Agile Community and Collaboration? 60
The Agile Community 64
A Continuum of Trust 67
The Mechanics of Collaboration 69
Consumer Collaboration 73
Doer Collaboration 77
Planner Collaboration 78
Precursors to Agility 80
Wrap-Up 82

Chapter 4 User Stories for BI Systems 85
What Are User Stories? 86
User Stories versus Requirements 89
From Roles to Use Cases to User Stories 92

User Roles 93
Use-Case Modeling 96
Finding User Stories in Event Flows 98
Use-Case Scenarios 98

Decomposing Epics 99
What’s the Smallest, Simplest Thing? 103
Story Prioritization and Backlog Management 107

Value-Based Prioritization 108
Capability-Based Prioritization 109


Prioritization Process 110
Backlog Management 111

Story-Point Estimating 111
Parking Lot Diagrams 117
Wrap-Up 119

Chapter 5 Self-Organizing Teams Boost Performance 121
What Is a Self-Organizing Team? 122
Self-Organization Requires Self-Discipline 127
Self-Organization Requires Shared Responsibility 128
Self-Organization Requires Team Working Agreements 130
Self-Organization Requires Honoring Commitments 132

Watch Out for Hangovers 133
Self-Organization Requires Glass-House Development 134
Self-Organizing Requires Corporate Alignment 136
Wrap-Up 137

Part II Agile Analytics: Technical Methods 139

Chapter 6 Evolving Excellent Design 141
What Is Evolutionary Design? 144
How Much Up-Front Design? 148
Agile Modeling 149
Data Model Patterns 152
Managing Technical Debt 154
Refactoring 157

What Is Refactoring? 159
When to Refactor 162
How to Refactor 165
Final Words on Refactoring 167

Deploying Warehouse Changes 167
Blue-Green Deployment 169
Database Versioning 170

Other Reasons to Take an Evolutionary Approach 171
Case Study: Adaptive Warehouse Architecture 174

Product Evolution 175
Architectural Overview 177
Observation Message Model 179

Wrap-Up 189


Chapter 7 Test-Driven Data Warehouse Development 193
What Is Agile Analytics Testing? 194
Agile Testing Framework 197

What about Performance, Load, and Stress Testing? 200
BI Test Automation 201

BI Testing Process 203
Database Testing Tools 205
What to Test? 209

Sandbox Development 211
Test-First BI Development 215

Unit-Test-Driven Development 215
Storytest-Driven DW/BI Development 218
Generating Storytests 219

BI Testing Guidelines 220
Setup Time 221
Functional BI Testing 222
Wrap-Up 223

Chapter 8 Version Control for Data Warehousing 225
What Is Version Control? 226
The Repository 230

What to Store? 230
What Not to Store? 232

Working with Files 233
What Are Versions? 235
Tags, Branches, and Merging 236
Resolving Conflicts 238

Organizing the Repository 240
Explanatory Files 241
Directories 241

Tagging and Branching 245
When to Tag and Branch 245
Naming Tags and Branches 248
Keeping Things Simple 251

Choosing an Effective Tool 252
Wrap-Up 254

Chapter 9 Project Automation 257
What Is Project Automation? 258
Getting Started 261


Build Automation 262
Rudimentary Automated Build 264
More Advanced Automated Build 267
When to Get Started 274

Continuous Integration 274
Build Frequency 275
Scheduled Builds 276
Triggered Builds 277
Setting Up Continuous Integration 277

Push-Button Releases 281
What Is a Release? 282
Preparing a Release 282
Bundle the Release 283

Wrap-Up 288

Chapter 10 Final Words 291
Focus on the Real Problem 291
Being Agile versus Doing Agile 293
Gnarly Problems 296
What about Emerging Technologies? 298
Adoption Strategies 299

Expect Some Chaos 300
Leadership Responsibilities 302
Goals and Business Alignment 302
Agile Adoption Road-Mapping 303
Training and Coaching 303
Measuring Success 305

Closing Thoughts . . . 306

References and Recommended Reading 309

Index 315



FOREWORD
BY JIM HIGHSMITH

I was introduced to Ken Collier through a mutual friend about seven years
ago. We started meeting for coffee (a two-person Agile group in Flagstaff,
Arizona) every week or so to talk about software development, a sprinkling
of Agile here and there, skiing, mountain biking, and Ken’s analytics proj-
ects. Early on, as Ken talked about a project that was faltering and I talked
about Agile, he decided to try out Agile on his next project. As he quipped,
“It couldn’t be worse!”

Over the years I’ve heard every reason imaginable why “Agile won’t work
in my company because we are different.” Ken never had that attitude and
from the beginning kept trying to figure out not if Agile would work on
business intelligence and data warehousing projects, but how it would work.
Ken saw each impediment as an opportunity to figure out an Agile way to
overcome it. From developing user stories that traversed the entire analyt-
ics software stack, to figuring out how to do continuous integration in that
same diverse stack, Ken has always been Agile, just as he was learning to
do Agile. Today, Ken champions the cause of being Agile and not just doing
Agile.

Over subsequent analytics projects, one that ran for over three years, deliv-
ering releases every quarter, Ken took the fundamental Agile management
and development practices and came up with innovative ways to apply them.
Business intelligence and data warehousing developers have been reluctant
to embrace Agile (although that is changing) in part because it wasn’t clear
how to apply Agile to these large, data-centric projects. However, analytics
projects suffered from the same problems as more typical IT projects—they
took too long, cost too much, and didn’t satisfy their customers. In our cur-
rent turbulent business era these kinds of results are no longer acceptable.

One remarkable aspect of Agile Analytics is the breadth of coverage—from
product and backlog management, to Agile project management techniques,
to self-organizing teams, to evolutionary design practices, to automated
testing, to build management and continuous integration. Even if you are
not on an analytics project, Ken’s treatment of this broad range of topics
related to products with a substantial data-oriented flavor will be useful for
and beyond the analytics community.


In each subject area he has taken the basic Agile practices and custom-
ized them to analytics projects. For example, many BI and data warehouse
teams are far behind their software development counterparts in configura-
tion management. With execution code in Java, Ruby, and other languages,
stored procedures, SQL, and tool-specific code in specialized tools, analyt-
ics teams often have poor “code” management practices. Ken spends several
chapters on reviewing techniques that software developers have been using
and showing how those techniques can be adapted to an analytics envi-
ronment. Ken often asks analytics teams, “If your servers went down hard
today, how long would it take you to rebuild?” The responses he typically
receives vary from a few weeks to never! The automation of the build, inte-
gration, and test process is foreign to many analytics teams, so Ken spends
a chapter each on version control and build automation, showing how to
build a fast-paced continuous integration environment.

The book also devotes a chapter to explaining how to customize test-driven
development (TDD) to an analytics environment. Comprehensive, auto-
mated testing—from unit to acceptance—is a critical piece of Agile devel-
opment and a requirement for complete continuous integration.
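
As a small, hypothetical illustration of the test-first idea applied to analytics code (this sketch is not taken from the book itself), the tests below are written against a simple warehouse transformation before that transformation is trusted; the revenue-band rule and its thresholds are invented for the example.

import unittest

def revenue_band(annual_revenue):
    """Classify a customer into a reporting band used by a hypothetical BI dimension."""
    if annual_revenue < 0:
        raise ValueError("revenue cannot be negative")
    if annual_revenue < 1_000_000:
        return "SMB"
    if annual_revenue < 50_000_000:
        return "Mid-Market"
    return "Enterprise"

class RevenueBandTest(unittest.TestCase):
    # In test-first style these cases are written before the function above is implemented.
    def test_band_boundaries(self):
        self.assertEqual(revenue_band(999_999), "SMB")
        self.assertEqual(revenue_band(1_000_000), "Mid-Market")
        self.assertEqual(revenue_band(50_000_000), "Enterprise")

    def test_negative_revenue_is_rejected(self):
        with self.assertRaises(ValueError):
            revenue_band(-1)

if __name__ == "__main__":
    unittest.main()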

The breadth of Ken’s topic coverage extends to architecture. While he advo-
cates architecture evolution (and evolutionary design is covered in Chapter 6,
“Evolving Excellent Design”), he describes architectural patterns that are
adaptive. In Chapter 6 he introduces an adaptable analytics architecture,
one that he used on a large project in which change over time was a key part
of the challenge. This architecture advocates a “data pull” in contrast to the
traditional “data push” approach, much like Kanban systems.

What I like about Ken’s book can be summarized by three points: (1) It
applies Agile principles and practices to analytics projects; (2) it addresses
technical and management practices (doing Agile) and core Agile principles
(being Agile); and (3) it covers an astonishingly wide range of topics—from
architecture to build management—yet it’s not at all superficial. This is
quite an accomplishment. Anyone participating in data-centric or business
analytics projects will benefit from this superb book.

—Jim Highsmith
Executive Consultant
Thoughtworks, Inc.


FOREWORD
BY WAYNE ECKERSON

Several years ago, I spearheaded the development of Web sites for The Data
Warehousing Institute’s local chapters. I had established the program two
years earlier and worked closely with many of the officers to grow the chap-
ters and host events.

As the “business driver” of the project, I knew exactly what functionality
the chapter Web sites needed. I had researched registration and collabora-
tion systems and mapped their capabilities to my feature matrix. I was ready
to wheel and deal and get a new system up and running in three months.

Unfortunately, the project went “corporate.” The president assigned some-
one to manage the project, an IT person to collect requirements, and a
marketing person to coordinate integration with our existing Web site. We
established a regular time to meet and discuss solutions. In short order, the
project died.

My first sense of impending doom came when I read the requirements doc-
ument compiled by the IT developer after I had e-mailed her my require-
ments and had a short conversation. When I read the document—and I’m
technically astute—I no longer recognized my project. I knew that anyone
working from the document (i.e., vendor or developer) would never get
close to achieving the vision for the Web sites that I felt we needed.

This experience made me realize how frustrated business people get with
IT’s traditional approach to software development. Because I witnessed
how IT translates business requirements into IT-speak, I now had a greater
understanding of why so many business intelligence (BI) projects fail.

Agile to the rescue. When I first read about Agile development techniques,
I rejoiced. Someone with a tad of business (and common) sense had finally
infiltrated the IT community. Everything about the methodology made
perfect sense. Most important, it shifts the power in a development project
from the IT team to business users for whom the solution is being built!

However, the Agile development methodology was conceived to facili-
tate software projects for classic transaction-processing applications.


Unfortunately, it didn’t anticipate architecture- and data-laden develop-
ment projects germane to business intelligence.

Fortunately, BI practitioners like Ken Collier have pioneered new territory
by applying Agile methods to BI and have lived to tell about their experi-
ences. Ken’s book is a fount of practical knowledge gleaned from real project
work that shows the dos and don’ts of applying Agile methods to BI.

Although the book contains a wealth of process knowledge, it’s not a how-
to manual; it’s really more of a rich narrative that gives would-be Agile BI
practitioners the look, feel, smell, and taste of what it’s like to apply Agile
methods in a real-world BI environment. After you finish reading the book,
you will feel as if you have worked side by side with Ken on a project and
learned from the master.

—Wayne Eckerson
Founder and President
BI Leadership Forum
Formerly Director of Research and Services, TDWI


PREFACE

WHEN DW/BI PROJECTS GO BAD
Most data warehouse developers have experienced projects that were less
than successful. You may even have experienced the pain of a failed or fail-
ing project. Several years ago I worked for a midsize company that was seek-
ing to replace its existing homegrown reporting application with a properly
architected data warehouse. My role on the project was chief architect and
technical lead. This project ended very badly and our solution was ulti-
mately abandoned. At the outset the project appeared poised for success and
user satisfaction. However, in spite of the best efforts of developers, project
managers, and stakeholders, the project ran over budget and over schedule,
and the users were less than thrilled with the outcome. Since this project
largely motivated my adaptation of Agile principles and practices to data
warehouse and business intelligence (DW/BI) development, I offer this brief
retrospective to help provide a rationale for the Agile DW/BI principles and
practices presented throughout this book. It may have some similarities to
projects that you’ve worked on.

About the Project

This section summarizes the essential characteristics of the project, includ-
ing the following:

• Existing application. The company’s existing reporting application
was internally referred to as a “data warehouse,” which significantly
skewed users’ understanding of what a data warehouse applica-
tion offers. In reality the data model was a replication of parts of
one of the legacy operational databases. This replicated database
did not include any data scrubbing and was wrapped in a signifi-
cant amount of custom Java code to produce the reports required.
Users had, at various times, requested new custom reports, and the
application had become overburdened with highly specialized and
seldom used reporting features. All of the reports could be classi-
fied as canned reports. The system was not optimized for analytical
activities, and advanced analytical capabilities were not provided.


• Project motivation. Because the existing “data warehouse” was
not architected according to data warehousing best practices, it
had reached the practical limits of maintainability and scalability
needed to continue meeting user requirements. Additionally, a new
billing system was coming online, and it was evident that the exist-
ing system could not easily be adapted to accommodate the new
data. Therefore, there was strong executive support for a properly
designed data warehouse.

• External drivers. The data warehousing project was initially envi-
sioned by a sales team from one of the leading worldwide vendors of
data warehousing and business intelligence software. In providing
guidance and presales support, this sales team helped the project
sponsors understand the value of eliciting the help of experienced
business intelligence consultants with knowledge of industry best
practices. However, as happens with many sales efforts, initial esti-
mates of project scope, cost, and schedule were overly ambitious.

• Development team. The development team consisted exclusively of
external data warehousing contractors. Because the company’s exist-
ing IT staff had other high-priority responsibilities, there were no
developers with deep knowledge of the business or existing opera-
tional systems. However, the development team had open access to
both business and technical experts within the company as well as
technology experts from the software vendor. While initial discov-
ery efforts were challenging, there was strong participation from all
stakeholders.

• Customer. The primary “customer” for the new data warehouse was
the company’s finance department, and the project was sponsored
by the chief financial officer. They had a relatively focused busi-
ness goal of gaining more reliable access to revenue and profitability
information. They also had a substantial volume of existing reports
used in business analysis on a routine basis, offering a reasonable
basis for requirements analysis.

• Project management. Project management (PM) responsibilities
were handled by corporate IT using traditional Project Management
Institute/Project Management Body of Knowledge (PMBOK) prac-
tices. The IT group was simultaneously involved in two other large
development projects, both of which had direct or indirect impact
on the data warehouse scope.

• Hosted environment. Because of limited resources and infrastruc-
ture, the company’s IT leadership had recently decided to partner
with an application service provider (ASP) to provide hosting ser-
vices for newly developed production systems. The data warehouse
was expected to reside at the hosting facility, located on the west
coast of the United States, while the company’s headquarters were
on the east coast. While not insurmountable, this geographic sepa-
ration did have implications for the movement of large volumes of
data since operational systems remained on the east coast, residing
on the corporate IT infrastructure.

Project Outcome

The original project plan called for an initial data warehouse launch within
three months but had an overly ambitious scope for this release cycle. Proj-
ect completion was a full eight months after project start, five months late!
User acceptance testing did not go well. Users were already annoyed with
project delays, and when they finally saw the promised features, there was
a large gap between what they expected and what was delivered. As is com-
mon with late projects, people were added to the development team during
the effort to try to get back on track. As Fred Brooks says, “Adding more
people to a late project only makes it later” (Brooks 1975). Ultimately, proj-
ect costs far exceeded the budget, users were unsatisfied, and the project was
placed on hold until further planning could be done to justify continued
development.

Retrospective

So who was to blame? Everybody! Users felt that the developers had missed
the mark and didn’t implement all of their requirements. Developers felt that
the users’ expectations were not properly managed, and the project scope
grew out of control. Project sponsors felt that the vendors overpromised and
underdelivered. Vendors felt that internal politics and organizational issues
were to blame. Finally, many of the organization’s IT staff felt threatened by
lack of ownership and secretly celebrated the failure.

The project degenerated into a series of meetings to review contracts and
project documents to see who should be held responsible, and guess what?
Everyone involved was partially to blame. In addition to the normal techni-
cal challenges of data warehouse development, the following were identified
as root causes of project failure:

• The contract did not sufficiently balance scope, schedule, and
resources.

• Requirements were incomplete, vague, and open-ended.
• There were conflicting interpretations of the previously approved
requirements and design documents.

• Developers put in long nights and weekends in chaotic attempts to
respond to user changes and new demands.

• The technical team was afraid to publicize early warning signs
of impending failure and continued trying to honor unrealistic
commitments.

• Developers did not fully understand the users’ requirements or
expectations, and they did not manage requirements changes well.

• Users had significant misconceptions about the purpose of a data
warehouse since existing knowledge was based on the previous
reporting application (which was not a good model of a warehouse).

• Vendors made ambitious promises that the developers could not
deliver on in the time available.

• The project manager did not manage user expectations.
• IT staff withheld important information from developers.
• The ASP partner did not provide the level of connectivity and technical
support the developers expected.

Hindsight truly is 20/20, and in the waning days of this project several things
became apparent: A higher degree of interaction among developers, users,
stakeholders, and internal IT experts would have ensured accurate under-
standing on the part of all participants. Early and frequent working software,
no matter how simplistic, would have greatly reduced the users’ misconcep-
tions and increased the accuracy of their expectations. Greater emphasis on
user collaboration would have helped to avoid conflicting interpretations
of requirements. A project plan that focused on adapting to changes rather
than meeting a set of “frozen” contractual requirements would have greatly
improved user satisfaction with the end product. In the end, and regardless
of blame, the root cause of this and many other data warehousing project
failures is the disconnect in understanding and expectations between devel-
opers and users.

ABOUT THIS BOOK
About the same time I was in the throes of the painful and failing project
just described, I met Jim Highsmith, one of the founding fathers of the Agile
movement, author of Adaptive Software Development, Agile Software Devel-
opment Ecosystems, and Agile Project Management and one of the two series
editors for the Agile Software Development Series of which this book is a
part. Jim listened to my whining about our project difficulties and gave me
much food for thought about how Agile methods might be adapted to DW/BI
systems development. Unfortunately, by the time I met Jim it was too late
to right that sinking ship. However, since then Jim and I have become good
friends, exchanging ideas over coffee on a mostly weekly basis. Well, mostly
he shares good ideas and I do my best to absorb them. Jim has become my
Agile mentor, and I have devoted my professional life since we first met to
ensuring that I never, ever work on another failing DW/BI project again.
Now that may seem like an audacious goal, but I believe that (a) life is too
short to suffer projects that are doomed to fail; (b) Agile development is the
single best project risk mitigation approach we have at our disposal; and (c)
Agile development is the single best means of innovating high-value, high-
quality, working DW/BI systems that we have available. That’s what this
book is about:

• Mitigating DW/BI project risk
• Innovating high-value DW/BI solutions
• Having fun!

Since my last painful project experience I have had many wonderful oppor-
tunities to adapt Agile development methods to the unique characteristics
of DW/BI systems development. Working with some very talented Agile
DW/BI practitioners, I have successfully adapted, implemented, and refined
a comprehensive set of project management and technical practices to create
the Agile Analytics development method.

This adaptation is nontrivial as there are some very significant and unique
challenges that we face that mainstream software developers do not. DW/BI
developers deal with a hybrid mix of integrating commercial software and
writing some custom code (ETL scripting, SQL, MDX, and application pro-
gramming are common). DW/BI development teams often have a broad and
disparate set of skills. DW/BI development is based on large data volumes
and a complex mixture of operational, legacy, and specialty systems. The
DW/BI systems development platform is often a high-end dedicated server
or server cluster, making it harder to replicate for sandbox development and
testing. For these reasons and more, Agile software development methods
do not always easily transfer to DW/BI systems development, and I have met
a few DW/BI developers who have given up trying. This book will introduce
you to the key technical and project management practices that are essential
to Agile DW/BI. Each practice will be thoroughly explained and demon-
strated in a working example, and I will show you how you might modify
each practice to best fit the uniqueness of your situation.


This book is written for three broad audiences:

• DW/BI practitioners seeking to learn more about Agile techniques
and how they are applied to the familiar complexities of DW/BI
development. For these readers I provide the details of Agile techni-
cal and project management techniques as they relate to business
intelligence and data-centric projects.

• Agile practitioners who want to know how to apply familiar Agile
practices to the complexities of DW/BI systems development. For
these readers I elaborate upon the traits of business intelligence proj-
ects and systems that make them distinctly different from software
development projects, and I show how to adapt Agile principles and
practices to these unique characteristics.

• IT and engineering management who have responsibility for and
oversight of program portfolios, including data warehousing, busi-
ness intelligence, and analytics projects. This audience may possess
neither deep technical expertise in business intelligence nor exper-
tise in Agile methods. For these readers I present an introduction to
an approach that promises to increase the likelihood of successful
projects and delighted customers.

Although this book isn’t a primer on the fundamentals of DW/BI systems, I
will occasionally digress into coverage of DW/BI fundamentals for the ben-
efit of the second audience. Readers already familiar with business intelli-
gence should feel free to skip over these sections.

By the way, although I’m not an expert in all types of enterprise IT systems,
such as enterprise resource planning (ERP) implementations, I have reason
to believe that the principles and practices that make up Agile Analytics can
be easily adapted to work in those environments as well. If you are an IT
executive, you might consider the broader context of Agile development in
your organization.

WHY AN AGILE DW/BI BOOK?
In the last couple of years the Agile software development movement has
exploded. Agile success stories abound. Empirical evidence continues to
increase and strongly supports Agile software development. The Agile com-
munity has grown dramatically during the past few years, and many large
companies have adopted agility across their IT and engineering depart-
ments. And there has been a proliferation of books published about various
aspects of Agile software development.


Unfortunately, the popularity of Agile methods has been largely lost on the
data and business intelligence communities. For some strange reason the
data community and software development community have always tended
to grow and evolve independently of one another. Big breakthroughs that
occur in one community are often lost on the other. The object-oriented
boom of the 1990s is a classic example of this. The software development
community has reaped the tremendous benefits of folding object orientation
into its DNA, yet object-oriented database development remains peripheral
to the mainstream for the data community.

Whenever I talk to groups of DW/BI practitioners and database developers,
the common reaction is that Agile methods aren’t applicable to data-centric
systems development. Their arguments are wide and varied, and they are
almost always based on myths, fallacies, and misunderstandings, such as
“It is too costly to evolve and change a data model. You must complete the
physical data model before you can begin developing reports and other user
features.”

The reality is that there is nothing special about data-centric systems that
makes Agile principles irrelevant or inappropriate. The challenge is that
Agile practices must be adapted, and a different tool set must be adopted for
data-centric systems development. Although many of the current books on
Agile concepts and techniques are directly relevant to the data community,
most of them do not speak directly to the data-minded reader. Unfortu-
nately, many current Agile books are too narrowly focused on new, green-
field software development using all the latest platforms, frameworks, and
programming languages. It can be difficult for readers to extrapolate the
ideas presented in these books to database development, data warehouse
development, ERP implementation, legacy systems development, and so
forth.

Agile author and database expert Scott Ambler has written books on Agile
database development and database refactoring (a distinctly Agile practice)
to engage the database community in the Agile dialogue. Similarly, I’ve
written this book to engage the DW/BI community in the Agile movement
because Agile is simply a better way to work on large, complex DW/BI sys-
tems. In 2008 Ralph Hughes’s book Agile Data Warehousing hit the shelves
(Hughes 2008). Ralph does a great job of adapting Scrum and eXtreme Pro-
gramming (XP) techniques to the nuances of data warehousing, and many
of those concepts are also present in this book. Additionally, this book aims
to dive into many of the technical practices that are needed to develop in an
Agile manner.


WHAT DO I MEAN BY AGILE ANALYTICS?
A word about terminology: I’ve chosen the title Agile Analytics more because
it’s catchy and manageable than because it precisely captures my focus. Face
it, Agile Data Warehousing, Business Intelligence, and Analytics would be a
mouthful. By and large the data warehousing community has come to use
the term data warehousing to refer to back-end management and prepara-
tion of data for analysis and business intelligence to refer to the user-facing
front-end applications that present data from the warehouse for analysis.
The term analytics is frequently used to suggest more advanced business
intelligence methods involving quantitative analysis of data (e.g., predic-
tive modeling, statistical analysis, etc.). Moreover, the industry term busi-
ness intelligence is sometimes an ambiguous and broadly encompassing term
that includes anything to do with data-driven business processes (business
performance management, customer relationship management, etc.) or
decision support (scorecards, dashboards, etc.).

My use of the moniker Agile Analytics should not imply that Agile meth-
ods are applicable only to a certain class of user-facing BI application devel-
opment. Agile methods are applicable and adaptable to data warehouse
development as well as business intelligence and analytical application
development. For many people Agile BI development tends to be easier to
imagine, since it is often assumed that the data warehouse has been built
and populated. Certainly a preexisting data warehouse simplifies the effort
required to build BI applications. However, you should not take this to
mean that the data warehouse must be completed prior to building BI appli-
cations. In fact, Agile Analytics is a user-value–driven approach in which
high-valued BI capabilities drive the evolutionary development of the data
warehouse components needed to support those capabilities. In this way
we avoid overbuilding the warehouse to support more than its intended
purpose.

In this book I focus primarily on the core of most flavors of DW/BI systems,
the data warehouse. My use of the term business intelligence or BI through-
out this book should be assumed to include analytic as well as reporting and
querying applications. When I use the term DW/BI system, you should infer
that I mean the core data warehouse along with any presentation applica-
tions that are served by the warehouse such as a finance dashboard, a fore-
casting portal, or some other BI application. However, the DW/BI acronym
is somewhat clunky, and I may occasionally use BI alone. In most of these
cases you should assume that I mean to include relevant DW components
as well. I’ll also address some of the advanced BI concepts like data mining
and data visualization. I’ll leave it to the reader to extrapolate the practices
to more specific BI projects such as CRM implementations. The principles
still apply.

WHO SHOULD READ THIS BOOK?
An Agile DW/BI team is made up of more than just developers. It includes
the customer (user) community, who provide requirements; the business
stakeholder community, who are monitoring the impact of the BI system on
business improvements; and the technical community, who develop, deploy,
and support the DW/BI system. These communities are connected by a
project manager, a business analyst (or product owner), and an executive
sponsor. Each of these communities plays a crucial role in project success,
and each of these communities requires a well-defined set of Agile practices
to be effective in its role. This book is intended for both business and techni-
cal readers who are involved in one or more of the communities described.

Not everything in the book is meant for everyone on the list, but there is
something here for everyone. I have worked with many organizations that
seek Agile training, mentoring, and coaching. Occasionally I have to dispel
the myth that agility applies only to developers and techies.

At one company with which I was invited to work, the executive who spon-
sored the training said something like, “If our engineers could just start
doing Agile development, we could finish projects faster and our customers
would be happier.” This statement represents some unfortunate misconcep-
tions that can be a buzzkill for Agile teams.

First, successful agility requires a change in the mind-set of all team mem-
bers. Customer community members must understand that their time is
required to explore and exercise newly completed features, and to provide
continuous input and feedback on the same. Management community
members must adapt their expectations as project risk and uncertainty
unfolds, and as the team adapts to inevitable change. The technical com-
munity must learn a whole new way of working that involves lots of disci-
pline and rigor. And the project interface community must be committed
to daily project involvement and a shift in their role and contribution to
project success.

Second, Agile doesn’t always mean faster project completion. Even the best
project teams still have a finite capacity to complete a scope of work. Agility
is not a magic wand that makes teams work faster. Agile practices do steer
teams to focus on the high-value and riskiest features early. Therefore, it is
possible that an Agile DW/BI system can be launched into production ear-
lier, as soon as the most critical features are complete and accepted. How-
ever, I would caution against expecting significantly faster project cycles,
especially in the beginning. On the other hand, you should expect a signifi-
cant increase in quality and customer delight over traditional DW/BI devel-
opment approaches.

The bottom line is that successful adoption of Agile DW/BI requires aware-
ness, understanding, and commitment from the members of all of the
aforementioned project communities. For this reason I have tried to design
this book to provide something relevant for everyone.

HOW THIS BOOK IS ORGANIZED
This book is divided into two parts. Part I, “Agile Analytics: Management
Methods,” is focused on Agile project management techniques and delivery
team coordination. It includes the following chapters:

• Chapter 1, “Introducing Agile Analytics,” provides an overview and
baseline for this DW/BI approach.

• Chapter 2, “Agile Project Management,” introduces an effective col-
lection of practices for chartering, planning, executing, and moni-
toring an Agile Analytics project.

• Chapter 3, “Community, Customers, and Collaboration,” introduces
a set of guidelines and practices for establishing a highly collabora-
tive project community.

• Chapter 4, “User Stories for BI Systems,” introduces the story-driven
alternative to traditional requirements analysis and shows how use
cases and user stories drive the continuous delivery of value.

• Chapter 5, “Self-Organizing Teams Boost Performance,” introduces
an Agile style of team management and leadership as an effective
alternative to more traditional command-and-control styles.

This first part is written for everyone involved in an Agile Analytics proj-
ect, from executive sponsors, to project managers, to business analysts and
product owners, to technical leads and delivery team members. These chap-
ters establish a collection of core practices that shape the way an Agile proj-
ect community works together toward a successful conclusion.

Part II of the book, “Agile Analytics: Technical Methods,” is focused on
the technical methods that are necessary to enable continuous delivery of
business value at production-quality levels. This part includes the following
chapters:

• Chapter 6, “Evolving Excellent Design,” shows how the evolutionary
design process works and how to ensure that it results in higher-
quality data models and system components with minimal technical
debt.

• Chapter 7, “Test-Driven Data Warehouse Development,” introduces
a collection of practices and tools for automated testing, and for
taking a test-first approach to building data warehouse and business
intelligence components.

• Chapter 8, “Version Control for Data Warehousing,” introduces a set
of techniques and tools for keeping the entire DW/BI system under
version control and configuration management.

• Chapter 9, “Project Automation,” shows how to combine test
automation and version control practices to establish an automated
continuous integration environment that maintains confidence in
the quality of the evolving system.

• Chapter 10, “Final Words,” takes a look at some of the remaining
factors and considerations that are critical to the successful adoption
of an Agile Analytics approach.

I think of this part as a collection of modern development practices that
should be used on every DW/BI project, be it Agile or traditional (e.g.,
“waterfall”). However, these technical practices are essential when an Agile
Analytics approach is taken. These methods establish the minimally suf-
ficient set of technical practices needed to succeed in the continuous, incre-
mental, and evolutionary delivery of a high-value DW/BI system.

Of course, these technical chapters should be read by technical team leads
and delivery team members. However, I also recommend that nontechnical
project team members read the introductory sections of each of these chap-
ters. Doing so will help nontechnical members establish a shared under-
standing of the purpose of these practices and appreciate the value of the
technical team’s efforts to apply them.

HOW SHOULD YOU READ THIS BOOK?
I like to think of Agile Analytics techniques as supporting one of the follow-
ing focal points:


• Agile DW/BI management: the set of practices that are devoted to
how you run your project, including precursors to agility, Agile proj-
ect management methods, the Agile team, developer-user interface,
and so on

• Agile DW/BI technical methods: the set of practices that are
devoted to the development and delivery of a high-value, high-
quality, working DW/BI system, including specific technical prac-
tices like story-driven development, test-driven development, build
automation, code management, refactoring, and so on

The chapters are organized into these major sections. Each chapter is dedi-
cated to a key practice or related set of practices, beginning with an execu-
tive-level overview of the salient points of the chapter and progressing into
deeper coverage of the topic. Some of the chapter topics are rich enough to
deserve to be entire books. In these cases, my aim is to give the reader a solid
understanding of the topic, and ideally the motivation needed for a deeper
self-study of its mechanics.

If you are reading this to gain a high-level understanding of Agile DW/BI,
the initial overview at the beginning of each chapter will suffice. My goal in
these overviews is to provide an accurate portrayal of each of the Agile DW/
BI practices, but these sections aren’t intended to give you all the techniques
needed to apply the practice.

If you are a data warehouse manager, project sponsor, or anyone who needs
to have a good working understanding of the practices without getting
bogged down in the technical details, I recommend reading the middle sec-
tions of each chapter, especially the project management chapters. These
sections are designed to provide a deep enough understanding of the topic to
either use the techniques or understand how they are used on your project.

If you are a member of the day-to-day project team (project managers,
technical team members, business analysts, product managers, etc.), I rec-
ommend reading the details and examples in each of the project manage-
ment chapters (Part I, “Agile Analytics: Management Methods”). These are
designed to give you a concrete set of techniques to apply in your release
planning, iteration planning, and all other project management and user
collaboration activities. If you are a member of the technical community,
the chapters in Part II, “Agile Analytics: Technical Methods,” are intended
for you.


A word about DW/BI technologies: I am a technology agnostic. I have
done DW/BI development using a variety of technology stacks that are
IBM-DB2-centric, Oracle-centric, SAS-centric, and Microsoft-centric, as
well as a variety of hybrid technology stacks. While some technologies may
lend themselves to Agile DW/BI better than others, I am confident that the
guiding principles and practices introduced in this book are technology-
independent and can be effective regardless of your tool choices.

As this book goes to press, there are an increasing number of data ware-
house and business intelligence tool vendors that are branding their prod-
ucts as Agile. Tools and tool suites from forward-thinking vendors such
as WhereScape, Pentaho, Balanced Insight, and others offer some exciting
possibilities for enabling agility. While I do not believe that you must have
these types of tools to take an Agile approach, they certainly do offer some
powerful benefits to Agile delivery teams. The Agile software development
community has greatly benefited from tools that help automate difficult
development activities, and I look forward to the benefits that our com-
munity stands to gain from these vendors. At the same time I would cau-
tion you not to believe that you must have such tools before you can start
being Agile. Instead, I encourage you to get started with Agile techniques
and practices and adopt tools incrementally as you determine that they are
of sufficient benefit.



ACKNOWLEDGMENTS

I would never have gotten the experience and knowledge I needed to write
this book without the contributions of several key people. These friends and
colleagues have my respect and gratitude for the many valuable interactions
I’ve had with them, and the collaborations that ultimately resulted in the
Agile Analytics approach.

Foremost, my good friend Jim Highsmith has been my trusted adviser and
mentor since the beginning of my Agile journey. Jim was just starting to write
the first edition of Agile Project Management when I first met him, and he
made book-writing look so easy that I decided to give it a try. As it turns out,
it’s much harder than he makes it look. My weekly breakfast discussions with
Jim were critical in shaping the concepts in this book. He voluntarily served
as my developmental editor, reviewing early drafts of sections and chapters
and helping me pull things together in a more cohesive and coherent fashion.
Jim continues to challenge my assumptions and gives me new ideas and new
ways to think about the complexities of development. He also didn’t give up
on me when book-writing wasn’t my highest priority. Thanks, Jim.

Jim introduced me to Luke Hohmann at a time when Luke was looking
for somebody with both data warehousing experience and Agile knowl-
edge. Luke is one of the most visionary people I’ve ever met. I was fortu-
nate enough to be the chief architect for one of Luke’s innovative ideas: a
complex, hosted, enterprise DW/BI product offering from one of Luke’s
clients. The complexity of this project and Luke’s deep knowledge of Agile
techniques challenged me (and our team) to figure out how to apply Agile
software methods to the nuances of DW/BI development. The concepts in
this book stem from that experience and have been refined and matured
on subsequent projects. Luke has become a great friend over the past seven
years, and I value his wisdom and vision. Thanks, Luke.

My team on the aforementioned project remains one of the best Agile teams
I have yet experienced either as a participant or as an Agile trainer. This
team included David Brink, Robert Daugherty, James Slebodnick, Scott
Gilbert, Dan O’Leary, Jonathon Golden, and Ricardo Aguirre. Each team
member brought a special set of skills and perspectives, and over that first
three-plus-year-long project these friends and teammates helped me figure
out effective ways to apply Agile techniques to DW/BI development. I’ve
since had other project opportunities to work with many of these friends,
further refining Agile Analytics concepts. These team members deserve
much of the credit for validating and tweaking Agile Analytics practices in a
complex and real-life situation. Thanks, guys.

Jim Highsmith also introduced me to Scott Ambler along the way. Scott has
led the charge in applying Agile to data-centric systems development. For-
tunately for all of us, Scott is a prolific writer who freely shares his ideas
in his many books and on his ambysoft.com Web site. I have benefited
greatly from the conversations I’ve had with him, as well as from his writ-
ings on Agile Modeling, Agile Data, Agile Unified Process, and Database
Refactoring (together with Pramod Sadalage). In the early days of my focus
on Agile in DW/BI, Scott and I regularly lamented our perceptions that the
data community wasn’t paying attention to the benefits of agility, while the
software community wasn’t paying attention to the unique challenges of
database development and systems integration. Scott gave much of his time
reviewing this book. He has given me much to think about and shared ideas
with me that I might otherwise have missed. Thanks, Scott.

I don’t think I truly understood what it means for somebody to have “the
patience of a saint” before working with Addison-Wesley editor Chris
Guzikowski and editorial assistant Raina Chrobak. As it turns out, I am
a painfully slow author who is not very good at applying Agile principles
to book-writing deadlines. Huge thanks go to Raina and Chris, who were
amazingly patient as I slipped deadline after deadline. I hope I have future
opportunities to redeem myself as an author.

Ralph Hughes’s Agile Data Warehousing book hit the shelves as I was writing
this book. Ralph and I were acquainted at that time and since have become
friends and colleagues. I am grateful for his work in this area and for the dis-
cussions I’ve had with him and the experiences he has shared. Although I have
tried not to duplicate what Ralph has already published, I am confident that
our approaches are consistent with and complementary to one another. I look
forward to future collaborations with Ralph as our ideas mature and evolve.

Finally, the ideas presented in this book have benefited tremendously from
smart and thoughtful people willing to review its early drafts and give me
guidance. In addition to Scott’s and Jim’s reviews, special thanks go to Jona-
thon Golden, my go-to guru on project automation, and Israel Gat, expert
on Agile leadership and technical debt. My gratitude also goes to DW/
BI experts Wayne Eckerson and Dale Zinkgraf and to Agile data expert
Pramod Sadalage for their feedback. Their contributions were invaluable.

ABOUT THE AUTHOR

Ken Collier got excited about Agile development in 2003 and was one of
the first to start combining Agile methods with data warehousing, business
intelligence, and analytics. These disciplines present a unique set of chal-
lenges to the incremental/evolutionary style of Agile development. Ken has
successfully adapted Agile techniques to data warehousing and business
intelligence to create the Agile Analytics style. He continues to refine these
ideas as a technical lead and project manager on several Agile DW/BI proj-
ect teams. Ken also frequently trains data warehousing and business intel-
ligence teams in Agile Analytics, giving him the opportunity to exercise this
approach with various technologies, team dynamics, and industry domains.
He has been an invited keynote speaker on the subject of Agile DW/BI at
several U.S. and international conferences, including multiple TDWI (The
Data Warehousing Institute) World Conferences as well as HEDW (Higher
Education Data Warehousing) annual conferences.

In nearly three decades of working in advanced computing and technology, Ken
has experienced many of the trends that come and go in our field, as well as the
ones that truly transform the state of our practices. With an M.S. and Ph.D. in
computer science engineering, Ken is formally trained in software engineering,
data management, and machine learning. He loves challenging problems in the
areas of systems architecture and design, systems/software development life-
cycles, project leadership, data warehousing, business intelligence, and advanced
analytics. Ken also loves helping organizations adopt and tailor effective
approaches and solutions that might not otherwise be apparent. He combines a
deep technical foundation with sound business acumen to help bridge the gaps
that often exist between technical and business professionals.

Ken is the founder and president of KWC Technologies, Inc., and is a senior
consultant with the Cutter Consortium in both the Agile Development and
Business Intelligence practice areas. Ken has had the privilege of working as
a software engineer for a large semiconductor company. He has spent sev-
eral years as a tenured professor of computer science engineering. He has
directed the data warehousing and business intelligence solutions group for
a major consulting firm. And, most recently, he has focused on enabling
organizational agility, including Agile software engineering, Agile Analyt-
ics, and Agile management and leadership for client companies.

Chapter 1

INTRODUCING AGILE ANALYTICS

Like Agile software development, Agile Analytics is established on a set of
core values and guiding principles. It is not a rigid or prescriptive methodol-
ogy; rather it is a style of building a data warehouse, data marts, business
intelligence applications, and analytics applications that focuses on the early
and continuous delivery of business value throughout the development life-
cycle. In practice, Agile Analytics consists of a set of highly disciplined prac-
tices and techniques, some of which may be tailored to fit the unique data
warehouse/business intelligence (DW/BI) project demands found in your
organization.

Agile Analytics includes practices for project planning, management, and
monitoring; for effective collaboration with your business customers and
management stakeholders; and for ensuring technical excellence by the
delivery team. This chapter outlines the tenets of Agile Analytics and estab-
lishes the foundational principles behind each of the practices and tech-
niques that are introduced in the successive chapters in this book.

Agile is a reserved word when used to describe a development style. It means
something very specific. Unfortunately, “agile” occasionally gets misused as
a moniker for processes that are ad hoc, slipshod, and lacking in discipline.
Agile relies on discipline and rigor; however, it is not a heavyweight or highly
ceremonious process despite the attempts of some methodologists to codify
it with those trappings. Rather, Agile strikes a balance between just enough
structure and just enough flexibility. It has been said that Agile is simple
but not easy, meaning that it is built on a simple set of sensible values and
principles but requires a high degree of discipline and rigor to execute
properly. It is important to accurately understand the mini-
mum set of characteristics that differentiate a true Agile process from those
that are too unstructured or too rigid. This chapter is intended to leave you
with a clear understanding of those characteristics as well as the underlying
values and principles of Agile Analytics. These are derived directly from the
tried and proven foundations established by the Agile software community
and are adapted to the nuances of data warehousing and business intelli-
gence development.

ALPINE-STYLE SYSTEMS DEVELOPMENT
I’m a bit of an armchair climber and mountaineer. I’m fascinated by the
trials and travails of climbing high mountains like Everest, Annapurna,
and others that rise to over 8,000 meters above sea level. These expeditions
are complicated affairs involving challenging planning and logistics, a high
degree of risk and uncertainty, a high probability of death (for every two
climbers who reach the top of Annapurna, another one dies trying!), diffi-
cult decisions in the face of uncontrollable variables, and incredible rewards
when success is achieved. While it may not be as adventuresome, building
complex business intelligence systems is a lot like high-altitude climbing.
We face lots of risk and uncertainty, complex planning, difficult decisions
in the heat of battle, and the likelihood of death! Okay, maybe not that last
part, but you get the analogy. Unfortunately, the success rate for building
DW/BI systems isn’t much better than the success rate for high-altitude
mountaineering expeditions.

Climbing teams first began successfully “conquering” these high mountains
in the 1950s, ’60s, and ’70s. In those early days the preferred mountaineer-
ing style was known as “siege climbing,” which had a lot of similarities to a
military excursion. Expeditions were led in an autocratic command-and-
control fashion, often by someone with more military leadership experi-
ence than climbing experience. Climbing teams were supported by the large
numbers of porters required to carry massive amounts of gear and supplies
to base camp and higher. Mounting a siege-style expedition takes over a
year of planning and can take two months or more to execute during the
climbing season. Siege climbing is a yo-yo-like affair in which ropes are
fixed higher and higher on the mountain, multiple semipermanent camps
are established at various points along the route, and loads of supplies are
relayed by porters to those higher camps. Finally, with all this support, a
small team of summit climbers launches the final push for the summit on a
single day, leaving from the high camp and returning to the same. Brilliant
teams have successfully climbed mountains for years in this style, but the
expeditions are prohibitively expensive, time-consuming to execute, and
fraught with heavyweight procedures and bureaucracy.

Traditional business intelligence systems development is a lot like siege climb-
ing. It can result in high-quality, working systems that deliver the desired
capabilities. However, these projects are typically expensive, exhibiting a lot
of planning, extensive design prior to development, and long development
cycles. Like siege-style expeditions, all of the energy goes into one shot at the
summit. If the summit bid fails, it is too time-consuming to return to base
camp and regroup for another attempt. In my lifetime (and I’m not that old
yet) I’ve seen multiple traditional DW/BI projects with budgets of $20 mil-
lion or more, and timelines of 18 to 24 months, founder. When such projects
fail, the typical management response is to cancel the project entirely rather
than adjust, adapt, and regroup for another “summit attempt.”

In the 1970s a new mountaineering method called “alpine-style” emerged,
making it feasible for smaller teams to summit these high peaks faster, more
cheaply, and with less protocol. Alpine-style mountaineering still requires
substantial planning, a sufficient supporting team, and enough gear and
supplies to safely reach the summit. However, instead of spending months
preparing the route for the final summit push, alpine-style climbers spend
about a week moving the bare essentials up to the higher camps. In this style,
if conditions are right, summits can be reached in a mere ten days. Teams
of two to three climbers share a single tent and sleeping bag, fewer ropes are
needed, and the climbers can travel much lighter and faster. When condi-
tions are not right, it is feasible for alpine-style mountaineers to return to
base camp and wait for conditions to improve to make another summit bid.

Agile DW/BI development is much like alpine-style climbing. It is essential
that we have a sufficient amount of planning, the necessary support to be
successful, and an appropriate amount of protocol. Our “summit” is the
completion of a high-quality, working business intelligence system that is of
high value to its users. As in mountaineering, reaching our summit requires
the proper conditions. We need just the right amount of planning—but we
must be able to adapt our plan to changing factors and new information.
We must prepare for a high degree of risk and uncertainty—but we must be
able to nimbly manage and respond as risks unfold. We need support and
involvement from a larger community—but we seek team self-organization
rather than command-and-control leadership.

Agile Analytics is a development “style” rather than a methodology or even
a framework. The line between siege-style and alpine-style mountaineering is
not precisely defined, and alpine-style expeditions may include some siege-style
practices. Each style is best described in terms of its values and guiding prin-
ciples. Each alpine-style expedition employs a distinct set of climbing practices
that support a common set of values and principles. Similarly, each Agile DW/
BI project team must adapt its technical, project management, and customer
collaboration practices to best support the Agile values and principles.1

1. I’m not the first Agile advocate to discuss the analogy between climbing and Agile
development. Jim Highsmith made a similar analogy in his 2000 book, Adaptive
Software Development: A Collaborative Approach to Managing Complex Systems
(Highsmith 2000).

Premier mountaineer Ed Viesturs has a formula, or core value, that is his
cardinal rule in the big mountains: “Getting to the top is optional. Get-
ting down is mandatory.” (Viesturs and Roberts 2006) I love this core value
because it is simple and elegant, and it provides a clear basis for all of Ed’s
decision making when he is on the mountain. In the stress of the climb, or
in the midst of an intensely challenging project, we need just such a basis for
decision making—our “North Star.” In 2000, a group of the most influen-
tial application software developers convened in Salt Lake City and formed
the Agile Alliance. Through the process of sharing and comparing each
of their “styles” of software development, the Agile Manifesto emerged as
a simple and elegant basis for project guidance and decision making. The
Agile Manifesto reads:2

Manifesto for Agile Software Development

We are uncovering better ways of developing software by doing it and
helping others do it. Through this work we have come to value:

Individuals and interactions over processes and tools
Working software over comprehensive documentation
Customer collaboration over contract negotiation
Responding to change over following a plan

That is, while there is value in the items on the right, we value the
items on the left more.

With due respect to the Agile Alliance, of which I am a member, I have
adapted the Agile Manifesto just a bit in order to make it more appropriate
to Agile Analytics:

Manifesto for Agile Analytics Development

We are uncovering better ways of developing data warehousing and business
intelligence systems by doing it and helping others do it. Through this work
we have come to value:

Individuals and interactions over processes and tools
Working DW/BI systems over comprehensive documentation
End-user and stakeholder collaboration over contract negotiation
Responding to change over following a plan

That is, while there is value in the items on the right, we value the items on the
left more.

2. www.agilealliance.org

I didn’t want to mess with the original manifesto too much, but it is impor-
tant to acknowledge that DW/BI systems are fundamentally different from
application software. In addition to dealing with large volumes of data,
our efforts involve systems integration, customization, and programming.
Nonetheless, the Agile core values are very relevant to DW/BI systems devel-
opment. These values emphasize the fact that our primary objective is the
creation of high-quality, high-value, working DW/BI systems. Every activity
related to any project either (a) directly and materially contributes to this
primary objective or (b) does not. Agile Analytics attempts to maximize
a-type activities while acknowledging that there are some b-type activities
that are still important, such as documenting your enterprise data model.

WHAT IS AGILE ANALYTICS?
Throughout this book I will introduce you to a set of Agile DW/BI prin-
ciples and practices. These include technical, project management, and user
collaboration practices. I will demonstrate how you can apply these on your
projects, and how you can tailor them to the nuances of your environment.
However, the title of this section is “What Is Agile Analytics?” so I should
probably take you a bit further than the mountaineering analogy.

Here’s What Agile Analytics Is

So here is a summary of the key characteristics of Agile Analytics. This is
simply a high-level glimpse at the key project traits that are the mark of agil-
ity, not an exhaustive list of practices. Throughout the remainder of this
book I will introduce you to a set of specific practices that will enable you to
achieve agility on your DW/BI projects. Moreover, Agile Analytics is a devel-
opment style, not a prescriptive methodology that tells you precisely what
you must do and how you must do it. The dynamics of each project within
each organization require practices that can be tailored appropriately to
the environment. Remember, the primary objective is a high-quality, high-
value, working DW/BI system. These characteristics simply serve that goal:

- Iterative, incremental, evolutionary. Foremost, Agile is an iterative,
incremental, and evolutionary style of development. We work in
short iterations that are generally one to three weeks long, and never
more than four weeks. We build the system in small increments or
“chunks” of user-valued functionality. And we evolve the working
system by adapting to frequent user feedback. Agile development is
like driving around in an unfamiliar city; you want to avoid going
very far without some validation that you are on the right course.
Short iterations with frequent user reviews help ensure that we are
never very far off course in our development.

- Value-driven development. The goal of each development itera-
tion is the production of user-valued features. While you and I may
appreciate the difficulty of complex data architectures, elegant data
models, efficient ETL scripts, and so forth, users generally couldn’t
care less about these things. What users of DW/BI systems care
about is the presentation of and access to information that helps
them either solve a business problem or make better business deci-
sions. Every iteration must produce at least one new user-valued
feature in spite of the fact that user features are just the tip of the
architectural iceberg that is a DW/BI system.

- Production quality. Each newly developed feature must be fully
tested and debugged during the development iteration. Agile devel-
opment is not about building hollow prototypes; it is about incre-
mentally evolving to the right solution with the best architectural
underpinnings. We do this by integrating ruthless testing early and
continuously into the DW/BI development process.3 Developers
must plan for and include rigorous testing in their development
process. A user feature is “Done” when it is of production quality, it
is successfully integrated into the evolving system, and developers
are proud of their work. That same feature is “Done! Done!” when
the user accepts it as delivering the right value.

- Barely sufficient processes. Traditional styles of DW/BI develop-
ment are rife with a high degree of ceremony. I’ve worked on many
projects that involved elaborate stage-gate meetings between stages
of development such as the transition from requirements analysis
to design. These gates are almost always accompanied by a formal
document that must be “signed off” as part of the gating process.
In spite of this ceremony many DW/BI projects struggle or founder.
Agile DW/BI emphasizes a sufficient amount of ceremony to meet
the practical needs of the project (and future generations) but noth-
ing more. If a data dictionary is deemed important for use by future
developers, then perhaps a digital image of a whiteboard table or a
simple spreadsheet table will suffice. Since our primary objective
is the production of high-quality, high-value, working systems, we
must be able to minimize the amount of ceremony required for
other activities.

3. Historically database and data warehouse testing has lacked the rigor, discipline, and
automation that have benefited software development efforts (www.ambysoft.com/
surveys/dataQualitySeptember2006.html).

- Automation, automation, automation. The only way to be truly
Agile is to automate as many routine processes as possible. Test
automation is perhaps the most critical (see the sketch after this
list). If you must test your fea-
tures and system manually, guess how often you’re likely to rerun
your tests? Test automation enables you to frequently revalidate that
everything is still working as expected. Build automation enables
you to frequently build a version of your complete working DW/BI
system in a demo or preproduction environment. This helps estab-
lish continuous confidence that you are never more than a few hours
or days away from putting a new version into production. Agile
Analytics teams seek to automate any process that is done more
than once. The more you can automate, the more you can focus on
developing user features.

- Collaboration. Too often in traditional projects the development
team solely bears the burden of ensuring that timelines are met,
complete scope is delivered, budgets are managed, and quality is
ensured. Agile business intelligence acknowledges that there is a
broader project community that shares responsibility for project
success. The project community includes the subcommunities of
users, business owners, stakeholders, executive sponsors, techni-
cal experts, project managers, and others. Frequent collaboration
between the technical and user communities is critical to success.
Daily collaboration within the technical community is also critical.
In fact, establishing a collaborative team workspace is an essential
ingredient of successful Agile projects.

- Self-organizing, self-managing teams. Hire the best people, give
them the tools and support they need, then stand aside and allow
them to be successful. There is a key shift in the Agile project man-
agement style compared to traditional project management. The
Agile project manager’s role is to enable team members to work their
magic and to facilitate a high degree of collaboration with users and
other members of the project community. The Agile project team
decides how much work it can complete during an iteration, then
holds itself accountable to honor those commitments. The Agile
style is not a substitute for having the right people on the team.
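
To make the automation point concrete, here is a minimal sketch of the kind of
automated data test an Agile DW/BI team might run after every load. It is
illustrative only: the table and column names (stg_orders, dw_fact_orders,
customer_key) and the use of Python with pytest and SQLite are my own
assumptions, not anything prescribed by this book.

    # test_orders_load.py: a minimal, hypothetical automated data test.
    import sqlite3
    import pytest

    WAREHOUSE = "warehouse.db"  # assumed location of the test warehouse

    @pytest.fixture
    def conn():
        connection = sqlite3.connect(WAREHOUSE)
        yield connection
        connection.close()

    def test_no_rows_lost_between_staging_and_fact(conn):
        # Row counts should match once the load completes.
        staged = conn.execute("SELECT COUNT(*) FROM stg_orders").fetchone()[0]
        loaded = conn.execute("SELECT COUNT(*) FROM dw_fact_orders").fetchone()[0]
        assert loaded == staged

    def test_fact_table_has_no_null_keys(conn):
        # Keys must never be null in the presentation tier.
        nulls = conn.execute(
            "SELECT COUNT(*) FROM dw_fact_orders WHERE customer_key IS NULL"
        ).fetchone()[0]
        assert nulls == 0

Because tests like these run in seconds, they can be wired into the automated
build so that every change to the ETL code or the data model revalidates the
warehouse without any manual effort.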

Guiding Principles

The core values contained in the Agile Manifesto motivate a set of guid-
ing principles for DW/BI systems design and development. These prin-
ciples often become the tiebreaker when difficult trade-off decisions must
be made. Similarly, the Agile Alliance has established a set of principles for
software development.4 The following Agile Analytics principles borrow
liberally from the Agile Alliance principles:

- Our highest priority is to satisfy the DW/BI user community
through early and continuous delivery of working user features.

- We welcome changing requirements, even late in development.
Agile processes harness change for the DW/BI users’ competitive
advantage.

- We deliver working software frequently, providing users with new
DW/BI features every few weeks.

- Users, stakeholders, and developers must share project ownership
and work together daily throughout the project.

- We value the importance of talented and experienced business intel-
ligence experts. We give them the environment and support they
need and trust them to get the job done.

- The most efficient and effective method of conveying information
to and within a development team is face-to-face conversation.

- A working business intelligence system is the primary measure of
progress.

- We recognize the balance among project scope, schedule, and cost.
The data warehousing team must work at a sustainable pace.

- Continuous attention to the best data warehousing practices
enhances agility.

- The best architectures, requirements, and designs emerge from self-
organizing teams.

- At regular intervals, the team reflects on how to become more effec-
tive, then tunes and adjusts its behavior accordingly.

Take a minute to reflect on these principles. How many of them are present
in the projects in your organization? Do they make sense for your organiza-
tion? Give them another look. Are they realistic principles for your organi-
zation? I have found these not only to be commonsense principles, but also
to be effective and achievable on real projects. Furthermore, adherence to
these principles rather than reliance on a prescriptive and ceremonious pro-
cess model is very liberating.

4. www.agilemanifesto.org/principles.html

Myths and Misconceptions

There are some myths and misconceptions that seem to prevail among
other DW/BI practitioners and experts that I have talked to about this style
of development. I recently had an exchange on this topic with a seasoned
veteran in both software development and data warehousing who is certi-
fied at the mastery level in DW/BI and data management and who has man-
aged large software development groups. His misunderstanding of Agile
development made it evident that myths and misconceptions abound even
among the most senior DW/BI practitioners. Agile Analytics is not:

- A wholesale replacement of traditional practices. I am not suggest-
ing that everything we have learned and practiced in the short his-
tory of DW/BI systems development is wrong, and that Agile is the
new savior that will rescue us from our hell. There are many good
DW/BI project success stories, which is why DW/BI continues to
be among the top five strategic initiatives for most large companies
today. It is important that we keep the practices and methods that
work well, improve those that allow room for improvement, and
replace those that are problematic. Agile Analytics seeks to modify
our general approach to DW/BI systems development without dis-
carding the best practices we’ve learned on our journey so far.

- Synonymous with Scrum or eXtreme Programming (XP). Scrum is
perhaps the Agile flavor that has received the most publicity (along
with XP) in recent years. However, it is incorrect to say that “Agile
was formerly known as eXtreme Programming,” as one skeptic
told me. In fact, there are many different Agile development flavors
that add valuable principles and practices to the broader collective
known as Agile development. These include Scrum, Agile Model-
ing, Agile Data, Crystal, Adaptive, DSDM, Lean Development,
Feature Driven Development, Agile Project Management (APM),
and others.5 Each is guided by the core values expressed in the
Agile Manifesto. Agile Analytics is an adaptation of principles and
practices from a variety of these methods to the complexities of
data-intensive, analytics-based systems integration efforts like data
warehousing and data mart development.

- Simply iterating. Short, frequent development iterations are an
essential cornerstone of Agile development. Unfortunately, this key
practice is commonly misconstrued as the definition of agility. Not
long ago I was asked to mentor a development team that had “gone
Agile” but wasn’t experiencing the expected benefits of agility. Upon
closer inspection I discovered that they were planning in four-week
“iterations” but didn’t expect to have any working features until
about the sixth month of the project. Effectively they had divided
the traditional waterfall model into time blocks they called itera-
tions. They completely missed the point. The aim of iterative devel-
opment is to demonstrate working features and to obtain frequent
feedback from the user community. This means that every iteration
must result in demonstrable working software.

5. For a great survey of the various Agile flavors I highly recommend reading Agile
Software Development Ecosystems (Highsmith 2002).

- For systems integration; it’s only for programming. Much of
our effort in DW/BI development is focused on the integration of
multiple commercial tools, thereby minimizing the volume of raw
programming required. DW/BI tool vendors would have us believe
that DW/BI development is simply a matter of hooking up the tools
to the source systems and pressing the “Go” button. You’ve probably
already discovered that building an effective DW/BI system is not
that simple. A DW/BI development team includes a heterogeneous
mixture of skills, including extraction, transformation, load (ETL)
development; database development; data modeling (both relational
and multidimensional); application development; and others. In
fact, compared to the more homogeneous skills required for appli-
cations development, DW/BI development is quite complex in this
regard. This complexity calls for an approach that supports a high
degree of customer collaboration, frequent delivery of working soft-
ware, and frequent feedback—aha, an Agile approach!

- An excuse for ad hoc behavior. Some have mistaken the tenets of
Agile development for abandonment of rigor, quality, or structure,
in other words, “hacking.” This misperception could not be farther
from the truth. Agility is a focus on the frequent delivery of high-
value, production-quality, working software to the user community
with the goal of continuously adapting to user feedback. This means
that automated testing and quality assurance are critical compo-
nents of all iterative development activities. We don’t build proto-
types; we build working features and then mature those features
in response to user input. Others mistake the Agile Manifesto as
disdain of documentation, which is also incorrect. Agile DW/BI
seeks to ensure that a sufficient amount of documentation is pro-
duced. The keyword here is sufficient. Sufficiency implies that there
is a legitimate purpose for the document, and when that purpose is
served, there is no need for additional documentation.

In my work with teams that are learning and adopting the Agile DW/BI
development style, I often find that they are looking for a prescriptive meth-
odology that makes it very clear which practices to apply and when. This is
a natural inclination for new Agile practitioners, and I will provide some
recommendations that may seem prescriptive in nature. In fact you may
benefit initially by creating your own “recipe” for the application of Agile
DW/BI principles and practices. However, I need to reemphasize that Agile
Analytics is a style, not a methodology and not a framework. Figuratively,
you can absorb agility into your DNA with enough focus, practice, and
discipline. You’ll know you’ve reached that point when you begin applying
Agile principles to everything you do such as buying a new car, remodeling
a bathroom, or writing a book.

DATA WAREHOUSING ARCHITECTURES AND SKILL SETS
To ensure that we are working from a common understanding, here is a very
brief summary of data warehouse architectures and requisite skill sets. This
is not a substitute for any of the more comprehensive technical books on
data warehousing but should be sufficient as a baseline for the remainder of
the book.

Data Warehousing Conceptual Architectures

Figure 1.1 depicts an abstracted classical data warehousing architecture and
is suitable to convey either a Kimball-style (Kimball and Ross 2002) or an
Inmon-style (Inmon 2005) architecture. This is a high-level conceptual
architecture containing multiple layers, each of which includes a complex
integration of commercial technologies, data modeling and manipulation,
and some custom code.

The data warehouse architecture includes one or more operational source
systems from which data is extracted, transformed, and loaded into the data
warehouse repositories, as depicted in Figure 1.1.

[Figure 1.1 Classical data warehouse architecture: operational source systems
(legacy mainframe, CRM/ERP, flat files, external data) feed an integration
tier (ETL, ODS, EII, and metadata management), which loads the central
warehouse and data marts of the presentation tier; the analysis tier provides
reports, visualization, OLAP, scorecards and dashboards, and data mining.]

These source systems are optimized for the daily trans-
actional processing required to run the business operations. Most DW/
BI systems source data from multiple operational systems, some of which
are legacy systems that may be several decades old and reside on older
technologies.

Data from these sources is extracted into an integration tier in the architec-
ture that acts as a “holding pen” where data can be merged, manipulated,
transformed, cleansed, and validated without placing an undue burden on
the operational systems. This tier may include an operational data store or
an enterprise information integration (EII) repository that acts as a system
of record for all relevant operational data. The integration database is typi-
cally based on a relational data model and may have multiple subcompo-
nents, including pre-staging, staging, and an integration repository, each
serving a different purpose relating to the consolidation and preprocessing
of data from disparate source systems. Common technologies for staging
databases are Oracle, IBM DB2, Microsoft SQL Server, and NCR Teradata.
The DW/BI community is beginning to see increasing use of the open-
source database MySQL for this architectural component.

Data is extracted from the staging database, transformed, and loaded into a
presentation tier in the architecture that contains appropriate structures for
optimized multidimensional and analytical queries. This system is designed
to support the data slicing and dicing that define the power of a data ware-
house. There are a variety of alternatives for the implementation of the
presentation database, including normalized relational schemas and denor-
malized schemas like star, snowflake, and even “starflake.” Moreover, the
presentation tier may include a single enterprise data warehouse or a col-
lection of subject-specific data marts. Some architectures include a hybrid
of both of these. Presentation repositories are typically implemented in the
same technologies as the integration database.

Finally, data is presented to the business users at the analysis tier in the
architecture. This conceptual layer in the system represents the variety of
applications and tools that provide users with access to the data, including
report writers, ad hoc querying, online analytical processing (OLAP), data
visualization, data mining, and statistical analysis. BI tool vendors such as
Pentaho, Cognos, MicroStrategy, Business Objects, Microsoft, Oracle, IBM,
and others produce commercial products that enable data from the presen-
tation database to be aggregated and presented within user applications.

This is a generalized architecture, and actual implementations vary in the
details. One major variation on the Kimball architecture is the Inmon
architecture (Inmon 2005), which inserts a layer of subject-specific data
marts that contain subsets of the data from the main warehouse. Each data
mart supports only the specific end-user applications that are relevant to
the business subject area for which that mart was designed. Regardless of
your preferences for Kimball- versus Inmon-style architectures, and of the
variations found in implementation detail, Figure 1.1 will serve as reference
architecture for the discussions in this book. The Agile DW/BI principles
and practices that are introduced here are not specific to any particular
architecture.

Diverse and Disparate Technical Skills

Inherent in the implementation of this architecture are the following aspects
of development, each requiring a unique set of development skills:

- Data modeling. Design and implementation of data models are
required for both the integration and presentation repositories.
Relational data models are distinctly different from dimensional
data models, and each has unique properties. Moreover, relational
data modelers may not have dimensional modeling expertise and
vice versa.

- ETL development. ETL refers to the extraction of data from source
systems into staging, the transformations necessary to recast source
data for analysis, and the loading of transformed data into the pre-
sentation repository. ETL includes the criteria for selecting data
from source systems, any necessary transformations or derivations,
data quality audits, and cleansing (a minimal sketch follows this
list).

- Data cleansing. Source data is typically not perfect. Furthermore,
merging data from multiple sources can inject new data quality
issues. Data hygiene is an important aspect of data warehousing that
requires specific skills and techniques.

- OLAP design. Typically data warehouses support some variety of
online analytical processing (HOLAP, MOLAP, or ROLAP). Each
OLAP technique is different but requires special design skills to bal-
ance the reporting requirements against performance constraints.

- Application development. Users commonly require an applica-
tion interface into the data warehouse that provides an easy-to-use
front end combined with comprehensive analytical capabilities, and
one that is tailored to the way the users work. This often requires
some degree of custom programming or commercial application
customization.

- Production automation. Data warehouses are generally designed for
periodic automated updates when new and modified data is slurped
into the warehouse so that users can view the most recent data avail-
able. These automated update processes must have built-in fail-over
strategies and must ensure data consistency and correctness.

- General systems and database administration. Data warehouse
developers must have many of the same skills held by the typical
network administrator and database administrator. They must
understand the implications of efficiently moving possibly large vol-
umes of data across the network, and the issues of effectively storing
changing data.
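
As promised in the ETL item above, the following is a deliberately tiny sketch
of an extract-transform-load step, written in Python with pandas. Everything
specific in it (the source file, the column names, and the SQLite target) is an
assumption invented for illustration; it is not a recommended design.

    # etl_sketch.py: an illustrative extract-transform-load step with assumed names.
    import sqlite3
    import pandas as pd

    def run_daily_orders_load(source_csv: str, warehouse_path: str) -> int:
        # Extract: pull raw rows from a hypothetical operational export.
        raw = pd.read_csv(source_csv)

        # Transform: basic cleansing plus one derived column.
        clean = raw.dropna(subset=["order_id", "customer_id"]).copy()
        clean["order_date"] = pd.to_datetime(clean["order_date"], errors="coerce")
        clean["net_amount"] = clean["gross_amount"] - clean["discount_amount"]

        # Load: append into a presentation-tier table in a SQLite warehouse.
        with sqlite3.connect(warehouse_path) as conn:
            clean.to_sql("dw_fact_orders", conn, if_exists="append", index=False)
        return len(clean)

    if __name__ == "__main__":
        rows = run_daily_orders_load("orders_extract.csv", "warehouse.db")
        print(f"Loaded {rows} rows")

Real ETL jobs add far more: change data capture, audit columns, error handling,
and restartability. The point of the sketch is only to show how small a single,
testable increment of ETL work can be.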

WHY DO WE NEED AGILE ANALYTICS?
In my years as a DW/BI consultant and practitioner I have learned three
consistent truths: Building successful DW/BI systems is hard; DW/BI devel-
opment projects fail very often; and it is better to fail fast and adapt than to
fail late after the budget is spent.

First Truth: Building DW/BI Systems Is Hard

If you have taken part in a data warehousing project, you are aware of the
numerous challenges, perils, and pitfalls. Ralph Kimball, Bill Inmon, and
other DW/BI pioneers have done an excellent job of developing reusable
architectural patterns for data warehouse and DW/BI implementation. Soft-
ware vendors have done a good job of creating tools and technologies to
support the concepts. Nonetheless, DW/BI is just plain hard, and for several
reasons:

- Lack of expertise. Most organizations have not previously built a
DW/BI system or have only limited experience in doing so.

- Lack of experience. Most organizations don’t build multiple DW/BI
systems, and therefore development processes don’t get a chance to
mature through experience.

- Ambitious goals. Organizations often set out to build an enterprise
data warehouse, or at least a broad-reaching data mart, which makes
the process more complex.

- Domain knowledge versus subject matter expertise. DW/BI prac-
titioners often have extensive expertise in business intelligence
but not in the organization’s business domain, causing gaps in
understanding. Business users typically don’t know what they can,
or should, expect from a DW/BI system.

- Unrealistic expectations. Business users often think of data ware-
housing as a technology-based plug-and-play application that will
quickly provide them with miraculous insights.

- Educated user phenomenon. As users gain a better understanding of
data warehousing, their needs and wishes change.

- Shooting the messenger. DW/BI systems are like shining a bright
light in the attic: You may not always like what you find. When the
system exposes data quality problems, business users tend to dis-
trust the DW/BI system.

- Focus on technology. Organizations often view a DW/BI system
as an IT application rather than a joint venture between business
stakeholders and IT developers.

- Specialized skills. Data warehousing requires an entirely different
skill set from that of typical database administrators (DBAs) and
developers. Most organizations do not have staff members with
adequate expertise in these areas.

- Multiple skills. Data warehousing requires a multitude of unique
and distinct skills such as multidimensional modeling, data cleans-
ing, ETL development, OLAP design, application development, and
so forth.

These unique DW/BI development characteristics compound the already
complex process of building software or building database applications.

Second Truth: DW/BI Development Projects Fail Often

Unfortunately, I’m not the only one who has experienced failure on DW/
BI projects. A quick Google search on “data warehouse failure polls” results
in a small library of case studies, postmortems, and assessment articles.
Estimated failure rates of around 50 percent are common and are rarely
disputed.

When I speak to groups of business intelligence practitioners, I often
begin my talks with an informal survey. First I ask everyone who has been
involved in the completion of one or more DW/BI projects to stand. It var-
ies depending on the audience, but usually more than half the group stands
up. Then I ask participants to sit down if they have experienced projects
that were delivered late, projects that had significant budget overruns, or
projects that did not satisfy users’ expectations. Typically nobody is left
standing by the third question, and I haven’t even gotten to questions about
acceptable quality or any other issues. It is apparent that most experienced
DW/BI practitioners have lived through at least one project failure.

While there is no clear definition of what constitutes “failure,” Sid Adelman
and Larissa Moss classify the following situations as characteristic of limited
acceptance or outright project failure (Moss and Adelman 2000):

- The project is over budget.
- The schedule has slipped.
- Some expected functionality was not implemented.
- Users are unhappy.
- Performance is unacceptable.
- Availability of the warehouse applications is poor.
- There is no ability to expand.
- The data and/or reports are poor.
- The project is not cost-justified.
- Management does not recognize the benefits of the project.

In other words, simply completing the technical implementation of a data
warehouse doesn’t constitute success. Take another look at this list. Nearly
every situation is “customer”-focused; that is, primarily end users deter-
mine whether a project is successful.

There are literally hundreds of similar evaluations of project failures, and
they exhibit a great deal of overlap in terms of root causes: incorrect require-
ments, weak processes, inability to adapt to changes, project scope misman-
agement, unrealistic schedules, inflated expectations, and so forth.

Third Truth: It Is Best to Fail Fast and Adapt

Unfortunately, the traditional development model does little to uncover
these deficiencies early in the project. As Jeff DeLuca, one of the creators
of Feature Driven Development (FDD), says, “We should try to break the
back of the project as early as possible to avoid the high cost of change later
downstream.” In a traditional approach, it is possible for developers to plow
ahead in the blind confidence that they are building the right product, only
to discover at the end of the project that they were sadly mistaken. This is
true even when one uses all the best practices, processes, and methodologies.

What is needed is an approach that promotes early discovery of project
peril. Such an approach must place the responsibility of success equally on
the users, stakeholders, and developers and should reward a team’s ability to
adapt to new directions and substantial requirements changes.

As we observed earlier, most classes of project failure are user-satisfaction-
oriented. If we can continuously adapt the DW/BI system and align with
user expectations, users will be satisfied with the outcome. In all of my past
involvement in traditional DW/BI implementations I have consistently seen
the following phenomena at the end of the project:

- Users have become more educated about BI. As the project pro-
gresses, so does users’ understanding of BI. So, what they told you
at the beginning of the project may have been based on a misunder-
standing or incorrect expectations.

- User requirements have changed or become more refined. That’s
true of all software and implementation projects. It’s just a fact of
life. What they told you at the beginning is much less relevant than
what they tell you at the end.

- Users’ memories of early requirements reviews are fuzzy. It often
happens that contractually speaking, a requirement is met by the
production system, but users are less than thrilled, having reactions
like “What I really meant was . . .” or “That may be what I said, but
it’s not what I want.”

- Users have high expectations when anticipating a new and use-
ful tool. Left to their own imaginations, users often elevate their
expectations of the BI system well beyond what is realistic or reason-
able. This only leaves them disappointed when they see the actual
product.

- Developers build based on the initial snapshot of user require-
ments. In waterfall-style development the initial requirements are
reviewed and approved, then act as the scoping contract. Meeting
the terms of the contract is not nearly as satisfying as meeting the
users’ expectations.

All these factors lead to a natural gap between what is built and what is
needed. An approach that frequently releases new BI features to users, hears
user feedback, and adapts to change is the single best way to fail fast and
correct the course of development.

Is Agile Really Better?

There is increasing evidence that Agile approaches lead to higher project
success rates. Scott Ambler, a leader in Agile database development and
Agile Modeling, has conducted numerous surveys on Agile development in
an effort to quantify the impact and effectiveness of these methods. Begin-
ning in 2007, Ambler conducted three surveys specifically relating to IT
project success rates.6 The 2007 survey explored success rates of different IT
project types and methods. Only 63 percent of traditional projects and data
warehousing projects were successful, while Agile projects experienced a 72
percent rate of success. The 2008 survey focused on four success criteria:
quality, ROI, functionality, and schedule. In all four areas Agile methods
significantly outperformed traditional, sequential development approaches.
The 2010 survey continued to show that Agile methods in IT produce better
results.

6. The detailed results are available at www.ambysoft.com/surveys/.

I should note here that traditional definitions of success involve metrics
such as on time, on budget, and to specification. While these metrics may
satisfy management efforts to control budgets, they do not always correlate
to customer satisfaction. In fact, scope, schedule, and cost are poor mea-
sures of progress and success. Martin Fowler argues, “Project success is
more about whether the software delivers value that’s greater than the cost
of the resources put into it.” He points out that XP 2002 conference speaker
Jim Johnson, chairman of the Standish Group, observed that a large propor-
tion of features are frequently unused in software products. He quoted two
studies: a DuPont study, which found that only 25 percent of a system’s fea-
tures were really needed, and a Standish study, which found that 45 percent
of features were never used and only 20 percent of features were used often
or always (Fowler 2002). These findings are further supported by a Depart-
ment of Defense study, which found that only 2 percent of the code in $35.7
billion worth of software was used as delivered, and 75 percent was either
never used or was canceled prior to delivery (Leishman and Cook 2002).

Agile development is principally aimed at the delivery of high-priority value
to the customer community. Measures of progress and success must focus
more on value delivery than on traditional metrics of on schedule, on bud-
get, and to spec. Jim Highsmith points out, “Traditional managers expect
projects to be on-track early and off-track later; Agile managers expect
projects to be off-track early and on-track later.” This statement reflects
the notion that incrementally evolving a system by frequently seeking and
adapting to customer feedback will result in building the right solution, but
it may not be the solution that was originally planned.

The Difficulties of Agile Analytics

Applying Agile methods to DW/BI is not without challenges. Many of the
project management and technical practices I introduce in this book are
adapted from those of our software development colleagues who have been
maturing these practices for the past decade or longer. Unfortunately, the
specific practices and tools used to custom-build software in languages like
Java, C++, or C# do not always transfer easily to systems integration using
proprietary technologies like Informatica, Oracle, Cognos, and others.
Among the problems that make Agile difficult to apply to DW/BI develop-
ment are the following:

- Tool support. There aren’t many tools that support technical prac-
tices such as test-driven database or ETL development, database
refactoring, data warehouse build automation, and others that are
introduced in this book. The tools that do exist are less mature than
the ones used for software development. However, this current state
of tool support continues to get better, through both open-source as
well as commercial tools.

- Data volume. It takes creative thinking to use lightweight devel-
opment practices to build high-volume data warehouses and BI
systems. We need to use small, representative data samples to
quickly build and test our work, while continuously proving that
our designs will work with production data volumes (see the
sampling sketch after this list). This is more of an impediment to
our way of approaching the problem than a barrier inherent in the
problem domain. Impediments are challenges that can be eliminated
or worked around; barriers are insurmountable.

- “Heavy lifting.” While Agile Analytics is a feature-driven (think
business intelligence features) approach, the most time-consuming
aspect of building DW/BI systems is in the back-end data warehouse
or data marts. Early in the project it may seem as if it takes a lot of
“heavy lifting” on the back end just to expose a relatively basic BI
feature on the front end. Like the data volume challenge, it takes
creative thinking to build the smallest/simplest back-end data solu-
tion needed to produce business value on the front end.

- Continuous deployment. The ability to deploy new features into
production frequently is a goal of Agile development. This goal is
hampered by DW/BI systems that are already in production with
large data volumes. Sometimes updating a production data ware-
house with a simple data model revision can require significant time
and careful execution. Frequent deployment may look very different
in DW/BI from the way it looks in software development.
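
To illustrate the data volume point above, here is one way a team might carve
a small, repeatable test sample out of a large source table, sketched in
Python. The table name, key column, and one-percent sampling rate are
assumptions made for illustration, not guidance taken from the text.

    # sample_fixture.py: build a small, repeatable test sample from a big table.
    # Table and column names are hypothetical; adjust to your own source system.
    import sqlite3

    SAMPLE_RATE_PERCENT = 1  # assumed: keep roughly 1% of rows for fast tests

    def build_sample(source_db: str, fixture_db: str) -> None:
        src = sqlite3.connect(source_db)
        dst = sqlite3.connect(fixture_db)
        try:
            dst.execute("DROP TABLE IF EXISTS orders_sample")
            dst.execute(
                "CREATE TABLE orders_sample (order_id INTEGER, customer_id INTEGER, "
                "order_date TEXT, gross_amount REAL)"
            )
            # A deterministic modulus on the key keeps the sample repeatable,
            # so automated tests see the same rows on every run.
            rows = src.execute(
                "SELECT order_id, customer_id, order_date, gross_amount "
                "FROM orders WHERE order_id % 100 < ?",
                (SAMPLE_RATE_PERCENT,),
            ).fetchall()
            dst.executemany("INSERT INTO orders_sample VALUES (?, ?, ?, ?)", rows)
            dst.commit()
        finally:
            src.close()
            dst.close()

Periodic full-volume runs in a preproduction environment are still necessary;
the sample only keeps the minute-to-minute development and test loop fast.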

The nuances of your project environment may introduce other such diffi-
culties. In general, those who successfully embrace Agile’s core values and
guiding principles learn how to effectively adapt their processes to mitigate
these difficulties. For each of these challenges I find it useful to ask the ques-
tion “Will the project be better off if we can overcome this difficulty despite
how hard it may be to overcome?” As long as the answer to that question is
yes, it is worth grappling with the challenges in order to make Agile Ana-
lytics work. With time and experience these difficulties become easier to
overcome.

INTRODUCING FLIXBUSTER ANALYTICS
Now seems like a good time to introduce the running DW/BI example that
I’ll be revisiting throughout this book to show you how the various Agile
practices are applied. I use an imaginary video rental chain to demonstrate
the Agile Analytics practices. The company, FlixBuster, has retail stores in
cities throughout North America. FlixBuster also offers video rentals online,
where customers manage their rental requests and movies are shipped directly
to their mailing addresses. Finally, FlixBuster offers movie downloads
directly to customers’ computers.

FlixBuster has customers who are members and customers who are non-
members. Customers fall into three buying behavior groups: those who shop
exclusively in retail stores, those who shop exclusively online, and those who
split their activity across channels. FlixBuster customers can order a rental
online or in the store, and they can return videos in the store or via a post-
age-paid return envelope provided by the company.

Members pay a monthly subscription fee, which determines their rental
privileges. Top-tier members may rent up to three videos at the same time.
There is also a membership tier allowing two videos at a time as well as a tier
allowing one at a time. Members may keep their rentals indefinitely with no
late charges. As soon as FlixBuster receives a returned video from a member,
the next one is shipped. Nonmembers may also rent videos in the stores fol-
lowing the traditional video rental model with a four-day return policy.

Approximately 75 percent of the brick-and-mortar FlixBuster stores across
North America are corporately owned and managed; the remaining 25 per-
cent are privately owned franchises. FlixBuster works closely with franchise
owners to ensure that the customer experience is consistent across all stores.
FlixBuster prides itself on its large inventory of titles, the rate of customer
requests that are successfully fulfilled, and how quickly members receive
each new video by mail.

FlixBuster has a complex partnership with the studios producing the films and the clearinghouses that provide licensed media to FlixBuster and manage royalty payments and license agreements. Each title is associated with a royalty percentage to be paid to the studio. Royalty statements and payments are made on a monthly basis to each of the clearinghouses.

Furthermore, FlixBuster sales channels (e-tail and retail) receive a percentage of the video rental revenue. Franchise owners receive a negotiated revenue amount that is generally higher than for corporately owned retail outlets. The online channel receives still a different revenue percentage to cover its operating costs.
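
Because these revenue and royalty rules recur in later examples, here is a minimal sketch of how a single rental's revenue might be split. The percentages are invented purely for illustration (the book gives no figures), and the function and dictionary names are assumptions.

# Illustrative revenue split for one rental; all percentages are made up for the example.
CHANNEL_SHARE = {"corporate_store": 0.20, "franchise_store": 0.30, "online": 0.25}

def split_rental_revenue(amount, channel, royalty_pct):
    # Allocate the studio royalty first, then the channel's negotiated share; the rest is corporate net.
    royalty = amount * royalty_pct
    channel_cut = amount * CHANNEL_SHARE[channel]
    return {
        "studio_royalty": round(royalty, 2),      # owed to the clearinghouse on the monthly statement
        "channel_share": round(channel_cut, 2),   # franchises negotiate a higher share than corporate stores
        "flixbuster_net": round(amount - royalty - channel_cut, 2),
    }

print(split_rental_revenue(4.00, "franchise_store", royalty_pct=0.15))
# {'studio_royalty': 0.6, 'channel_share': 1.2, 'flixbuster_net': 2.2}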

FlixBuster has determined that there is a good business case for developing an enterprise business intelligence system. This DW/BI system will serve corporate users from finance, marketing, channel sales, customer management, inventory management, and other departments. FlixBuster also intends to launch an intranet BI portal for subscription use by its clearinghouse partners, studios, franchisees, and possibly even Internet movie database providers. Such an intranet portal is expected to provide additional revenue streams for FlixBuster.

There are multiple data sources for the FlixBuster DW/BI system, including FlixBackOffice, the corporate ERP system; FlixOps, the video-by-mail fulfillment system; FlixTrans, the transactional and point-of-sale system; FlixClear, the royalty management system; and others.
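
As a rough sketch of how these sources might be cataloged for extraction work, the mapping below lists each system named above with an assumed refresh cadence. The structure and cadences are illustrative assumptions, not details given in this book.

# Assumed catalog of FlixBuster source systems for extraction jobs; cadences are invented.
SOURCE_SYSTEMS = {
    "FlixBackOffice": {"kind": "corporate ERP",             "refresh": "nightly"},
    "FlixOps":        {"kind": "video-by-mail fulfillment", "refresh": "hourly"},
    "FlixTrans":      {"kind": "transactions/point of sale","refresh": "hourly"},
    "FlixClear":      {"kind": "royalty management",        "refresh": "monthly"},
}

for name, meta in SOURCE_SYSTEMS.items():
    print(f"extract {name} ({meta['kind']}) every {meta['refresh']}")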

FlixBuster has successfully completed other development projects using Agile methods and is determined to take an Agile Analytics approach to the development of its DW/BI system, FlixAnalysis. During high-level executive steering committee analysis and reviews, it has been decided that the first production release of FlixAnalysis will serve the finance department and will follow a timeboxed release cycle of six months.

WRAP-UP
This chapter has laid the foundation for an accurate, if high-level, understanding of Agile Analytics. Successive chapters in this book serve to fill in the detailed “how-to” techniques that an Agile Analytics team needs to put these concepts into practice. You should now understand that Agile Analytics isn't simply a matter of chunking tasks into two-week iterations, holding a 15-minute daily team meeting, or retitling the project manager a “scrum master.” Although these may be Agile traits, new Agile teams often limit their agility to these simpler concepts and lose sight of the things that truly define agility. True agility is reflected by traits like early and frequent delivery of production-quality, working BI features, delivering the highest-valued features first, tackling risk and uncertainty early, and continuous stakeholder and developer interaction and collaboration.

Agile Analytics teams evolve toward the best system design by continuously seeking and adapting to feedback from the business community. Agile Analytics balances the right amount of structure and formality against a sufficient amount of flexibility, with a constant focus on building the right solution. The key to agility lies in the core values and guiding principles more than in a set of specific techniques and practices, although effective techniques and practices are important. Mature Agile Analytics teams elevate themselves above a catalog of practices and establish attitudes and patterns of behavior that encourage seeking feedback, adapting to change, and delivering maximum value.

If you are considering adopting Agile Analytics, keep these core values and guiding principles at the top of your mind. When learning any new technique, it is natural to look for successful patterns that can be mimicked. This is a valuable approach that will enable a new Agile team to get on the right track and avoid unnecessary pitfalls. While I have stressed that Agile development is not a prescriptive process, new Agile teams will benefit from some recipe-style techniques. Therefore, many of the practices introduced in this book may have a bit of a prescriptive feel. I encourage you to try these practices first as prescribed and then, as you gain experience, tailor them as needed to be more effective. But be sure you're tailoring practices for the right reasons. Be careful not to tailor a practice simply because it was difficult or uncomfortable on the first try. Also, be sure not to simply cherry-pick the easy practices while ignoring the harder ones. Often the harder practices are the ones that will have the biggest impact on your team's performance.

INDEX

Numbers
90-day (six-iteration) planning cycle, dividing project plan into iterations, 88–89

A
Acceptance testing. See also Functional testing
in Agile testing framework, 199
for failed DW/BI project, xxi
key perspectives in testing software and systems, 197–198
process under test, 204
in traditional development, 193
WatiN utility for, 195
in waterfall development, 31
Accountability, of teams, 128
Accuracy, traits of Agile modeling, 150
Adaptive Object Modeling (AOM), 153–154
Adaptive Object Modeling (Yoder and Johnson), 190
Adaptive Software Development (Highsmith), 5
Adkins, Lyssa, 67, 69, 303
ADM (adaptive data model), 179
in architecture of message-driven warehouse, 188–189
creating adaptive warehouses, 190
SOR (System of Record) database built on, 177, 179
use of design patterns in, 153–154
Administrative skills, for implementing DW/BI systems, 16
Adoption strategies
expecting some chaos while making change, 300–301
goals and business alignment and, 302–303
leadership responsibilities regarding, 302
measuring success of, 305
overview of, 299–300
road map for, 303
training and coaching in, 303–305
Adzic, Gojko, 206
Agile Adoption Patterns (Elssamadisy), 300
Agile Alliance
guiding principles of, 9–10
overview of, 6
Agile Analytics, introduction to
Agile approach to developing DW/BI systems, 4–7
challenges of applying Agile methods to DW/BI, 20–22
data warehousing architectures and, 13–16
difficulty of building DW/BI systems, 16–17
fail fast and adapt approach, 18–19
FlixBuster example, 22–23
frequent failure of DW/BI development projects, 17–18
guiding principles, 9–10
myths and misconceptions, 10–13
overview of, 3
relating Agile approach to success rate, 19–20
summary (wrap up) of, 23–24
what it is, 7–9
what the term means, xxvi–xxvii
why it is needed, 16
Agile Best Practice (Ambler), 108
Agile Data Warehousing (Hughes), 306, xxv
Agile Database Techniques (Ambler), 226
Agile, defined, 3
Agile Manifesto, 6
Agile Model Driven Development (AMDD), 33
Agile Modeling (Ambler), 151
Agile Project Leadership Network (APLN), 302
Agile Project Management. See APM (Agile Project Management)
Agile Project Management (Highsmith), 39
Agreements, working agreements required by self-organizing teams, 130–131
Aguirre, Ricardo, xxxiii
Ambler, Scott, 19–20, 31, 33, 40, 44, 72–73, 91–92, 108, 146, 151, 158, 162–163, 165–166, 168, 194, 212, 226, xxv, xxxiv

AMDD (Agile Model Driven Development), 33
Analysis Patterns (Fowler), 152
Analysis tier, in data warehousing architecture, 13–14
Ancillary members, of Agile community, 66–67
AOM (Adaptive Object Modeling), 153–154
APLN (Agile Project Leadership Network), 302
APM (Agile Project Management)

changing role of, 35–36
colocation of teams, 44–45

envision phase, 32–33
explore phase, 33–35

just enough design, 39–40
making sense of varieties (flavors) of Agile devel-

opment, 36–39
monitoring feature completion not task time,

54–56
overview of, 25–26
phased-sequential DW/BI development, 30–32
planning to capacity and monitoring velocity of

work, 46–49
scenario, 27–30
summary (wrap up) of, 56–57
synchronization on daily basis, 41
technical debt and, 45–46
timeboxing, 42–44
tracking progress on daily basis, 49–53
what it is, 26

Application development, skills needed for imple-
menting DW/BI systems, 15–16

Architectural sketch, for envisioning user stories,
103–104

Architectures
data warehouse, 13–15
key testing points in data warehouse architec-

ture, 209–211
message-driven warehouse, 177–179
precursors for Agile projects, 81
technical skills needed for implementing DW,

15–16
Audit trails, benefits of version control, 227
Authentication, scripting, 285
Automation

build automation. See Build automation
characteristics of Agile Analytics, 9
of production in data warehousing, 16
project automation. See Projects, automating
test automation. See Test automation
types of, 260–261

B
Back end systems, development of, 21
Backlog. See Product backlog
Bad news, suppression of, 134
BDUF (Big Design Up Front)

costs and risks of, 144
just enough design as alternative to, 40

Beck, Kent, 37, 152, 205, 215, 221
Beedle, Mike, 38
Being Agile vs. Doing Agile, 293–296
BI (business intelligence). See also DW/BI (data

warehousing/business intelligence)
advanced BI techniques, 298
automating testing in, 201–203
black box technologies for testing, 211
comparing traditional BI systems with Agile

systems, 4–5
development, preproduction, and production

environments, 211–212
focusing on early and continuous delivery of BI

features, 291–292
guidelines for BI testing, 220–221
increasing demand for operational BI, 173
performance testing, 200–201
testing process for, 203–205
user stories for. See User stories
what the term means, xxvi

Big Design Up Front (BDUF)
costs and risks of, 144
just enough design as alternative to, 40

Bilateral commitments, in co-dev user groups, 75
Black box technologies, for BI testing, 211
Blaha, Michael, 152
Blue-green deployment, of warehouse changes,

169–170
BPM (business performance management), 173
Branching

creating release branches, 260
keeping things simple, 251–252
naming branches, 248–249
standards for, 245
tagging branches, 246
version control capabilities and, 237–238
when to branch, 245–248

Brand, Stewart, 148
Brink, David, xxxiii
Brooks, Fred, 80, 132, 295, xxi
Bug-tracking system, 229

Bugs, tagging, 248, 250, 252
Build automation

advanced, 267–268
build frequency, 275–276
defining the build tasks, 270–271
defining the directory structure, 269–270
defining the project, 268–269
defining the testing tasks, 271–273
overview of, 262–263
rudimentary, 264–267
scheduling builds, 276
selecting version control tools, 254
triggering builds, 277
when to start, 274

build/, in project directory structure, 241
Build scripts, in release package, 284
Build tasks, defining, 270–271
BUILDING file, in repository, 241
Bundling releases, 283–284
Burn-down chart

in collaboration session, 63
tracking progress on daily basis, 51–53

Business acceptability, Marick’s perspectives for
acceptance testing, 197

Business activities, parking lot diagrams and, 117
Business alignment, adoption strategies and,

302–303
Business performance management (BPM), 173

C
Capability testing, in Agile testing framework, 199
Capacity

planning to capacity, 46–49
vs. wish-based planning, 49

Card wall, tracking progress on daily basis, 51–52
Ceremony, minimizing in collaborative sessions, 72
Change

adapting to, 294
avoiding high costs of, 297
deploying warehouse changes, 167–169
expecting some chaos while adopting Agile

approach, 300–301
response to requiring collaboration, 59

Chaos, expecting during adoption process, 300–301
checkout command, working with files, 233
CI. See Continuous integration

Clark, Mike, 262
Cloud computing, enhanced by Agile development,

298–299
CM (code management) repository, 212–214
Co-development user group, attributes of, 74–76
Coaching, as part of adoption strategy, 303–305
Coad, Peter, 38, 117
CoC (cost of change), managing technical debt and,

155
Cockburn, Alistair, 38, 71, 96
Code management (CM) repository, 212–214
Code smells (Fowler)

data warehouse smells and refactoring, 163–164
database smells and refactoring, 162–163
overview of, 162

Code, storing in version control repository, 231
Cohn, Mike, 85, 86, 91, 111–112, 299
Collaboration. See also Community

characteristics of Agile Analytics, 9
consumer collaboration, 73–76
as a continuum of trust, 67–69
doer collaboration, 77–78
FlixBuster scenario, 61–64
leadership facilitating, 302
mechanics of, 69–73
not short-circuiting customer collaboration, 294
overview of, 59–60
planner collaboration, 78–79
precursors for Agile projects and, 80–81
summary (wrap up) of, 82
what it is, 60–61

Collegial membership, in co-dev user groups, 75
Collier, Ken, 194, xv, xviii, xxxv
Colocation of teams, 44–45
commit command, working with files, 233
Commitments

bilateral commitment in co-dev user groups, 75
customer commitment in co-development, 76
honoring commitments required by self-organiz-

ing teams, 132–133
to iteration plan, 146
precursors for Agile projects, 81

Communication
face-to-face communication facilitated by colo-

cation of teams, 44
synchronous vs. asynchronous, 72
traits of Agile modeling, 149–150

Community. See also Collaboration
core group in, 66
diversity as challenge facing data warehousing,

171–172
diversity in consumer community, 73
identifying and filling roles, 67
members of, 64–65
multiple roles of members, 65–66
what it is, 60–61

Complete, Repeatable, Informative, Schedulable,
Portable (CRISP), 262–263, 266

Complexity
facing gnarly problems, 296–297
simplifying complex Epics, 102

Compliance
corporate regulations and, 137
group interaction and, 68

Conceptual (reference) data model, consistency
with, 144–146

Configuration settings, storing in version control
repository, 231

Conflict resolution, version control and, 238–240
Consadine, Phil, 306
Consistency, traits of Agile modeling, 150
Consumer collaboration

characteristics of effective, 74
not short-circuiting,

294–295

overview of, 73–74

Consumers, in Agile community, 65
Continuous Delivery (Farley and Humble), 169
Continuous integration

emerging technologies impacting, 299
overview of, 261, 274–275
sandbox for, 259
scripts in release package, 284
setting up, 277–281

Continuum of trust, compliance to cooperation to
collaboration, 67–69

Controls, functional testing of user controls, 223
Conventions, compared with patterns, 152–153
Cooperation, relationship to collaboration, 67–68
Coordination, synchronization of work by teams on

daily basis, 41
Core group, in Agile community, 66
Core values, glass-house development and, 135
Corporate alignment

aligning business to adoption strategy, 302–303
required by self-organizing teams, 136–137

Corporate Information Factory (Inmon), 177
Cost of change (CoC), managing technical debt and,

155
Cost of sales (CoS), calculating, 145–146
Costs, in scope, schedule, and cost triangle, 41, 43
Cox, Simon, 180
The Craft of Software Testing (Marick), 197
CRISP (Complete, Repeatable, Informative, Sched-

ulable, Portable), 262–263, 266
Critical members, of Agile community, 66–67
Crystal Methods, flavors of Agile development, 38
Cunningham, Ward, 37, 45, 154–155, 205
Customers. See Users
Cut line, on prioritized backlog, 116
CVS

release tags, 245
storing code in CM repository, 212
version control tool, 251

D
Data adapters

in message-driven warehouse, 184–187
metadata-driven, 179

Data archive, scripting, 284
Data boundaries, epic-splitting approaches, 101
Data-centric approach, vs. user stories, 85
Data cleansing, skills needed for implementing DW/

BI systems, 15
Data definition language (DDL) scripts, version

control and, 228
data/, in project directory structure, 242
Data loading/reloading

challenges facing data warehousing systems, 174
scripting, 285

Data migration
facing gnarly problems, 297
scripting, 285

Data mining, advanced BI techniques, 298
Data Model Patterns: A Metadata Map (Hay), 152
Data Model Patterns: Conventions of Thought (Hay), 152
Data modeling

adaptive. See ADM (adaptive data model)
changes and, 40
just-in-time, 91
maintaining consistency with conceptual model,

144–145
patterns for, 152–154

skills needed for implementing DW/BI systems, 15
in waterfall development approach, 30

Data set, in BI testing process, 203
Data sources, challenges facing data warehousing

systems, 172–173
Data volume, difficulties of building DW/BI sys-

tems, 21
Data warehouse architecture, adaptive design

ADM (adaptive data model), 188–189
architectural overview, 177–179
data adapter, 184–187
message bus for pushing data,

182–184

OMM (Observation Message Model), 179–182
overview of, 174–175
product evolution, 175–177
SOR (System of Record) database, 187–188
warehouse repository, 184

Data Warehouse Bus (Kimball and Ross), 177
Data warehousing. See also DW/BI (data warehousing/

business intelligence)
benefits of evolutionary design, 146–147
conceptual architectures, 13–15
data-centric approach to, 85
deploying warehouse changes, 167–169
key testing points in, 209–211
message-driven warehouse, 175
new demands facing, 171–174
repository for, 184
skills needed for implementation of architec-

tures, 15–16
test-driven development. See test-driven

development

what the term means, xxvi

The Data Warehousing Institute (TDWI), 306, xvii,
xxxv

Databases
deployment sequence for, 168–169
evolutionary development of, 296–297
refactoring, 165–167
testing operational, 210
testing tools for, 205–209
versioning, 170–171

Daugherty, Robert, xxxiii
db/, in project directory structure, 242
DBAs (database administrators), skills needed for

implementing DW/BI systems, 16
DbFit

as database testing tool, 206–207

for test automation, 216
for version control, 228

DDL (data definition language) scripts, version
control and, 228

De Luca, Jeff, 38
Decision making

in collaborative sessions, 72
by groups, 123–125

Defects, tagging, 246
Delivery, frequent delivery as Agile principle, 81
DeLuca, Jeff, 18, 117
Deployment

always be ready to deploy, 171
blue-green deployment, 169–170
continuous, 21
optimizing deployment time, 225
storing deployment scripts in version control

repository, 232
types of automation, 261
of warehouse changes, 167–169

Design
evolutionary. See Evolutionary design
just enough design as tenet of Agility, 39–40
patterns, 152–154

Design Patterns (Gamma, et al.), 152
Desktop sharing, for virtual colocation, 77–78
Detail level, traits of Agile modeling, 150
Developers, evolutionary design practices for, 147
Development

Agile Analytics as style rather than methodology,
5, 293

environment for, in BI systems, 211
Directory structure

defining in build automation, 269–270
for version control, 241–245

Discipline, self-organization requiring self-discipline,
127

Distractions, eliminating from collaborative ses-
sions, 71

Do less practice, in Agile development, 107
doc/, in project directory structure, 241–242
Documentation

of collaborative sessions, 72
storing in version control repository, 231

Doers
in Agile community, 65
doer collaboration, 77–78

Done! at completion of testing process, 133, 194

Done! Done! user acceptance and, 133, 194
Drive: The Surprising Truth about What Motivates Us

(Pink), 122
DSDM (Dynamic Systems Development Method)

Consortium, 42

DW/BI (data warehousing/business intelligence)

Agile Analytics tailored to fit DW/BI projects, 3
Agile approach to developing DW/BI systems, 4–7
business intelligence. See BI (business

intelligence)
challenges of applying Agile methods to, 20–22
data warehousing. See data warehousing
difficulty of building DW/BI systems, 16–17
focusing on early and continuous delivery of

features, 291–292
frequent failure of DW/BI projects, 17–18,

xix–xxii
phased-sequential development as approach to

project management, 30–32
technologies, xxxi
testing in. See TDD (test-driven development)
what the term means, xxvi

Dyche, Jill, 306
Dynamic Systems Development Method (DSDM)

Consortium, 42

E
Eckerson, Wayne, 292, xvii–xviii, xxxiv
Elssamadisy, Amr, 300
Envision→Explore cycle

envision phase, 32–33
explore phase, 33–35

Envisioning process
for architecture design, 149
in Envision→Explore cycle, 32–33
in FlixBuster scenario, 27–28
for message-driven warehouse, 176

Epics
as collection of related stories, 99–100
epic-splitting approaches, 100–101
removing risk and uncertainty, 102
simplifying complex, 102

Essential members, of Agile community, 66–67
Estimating story-point, 88–89, 111–112
ETL (extraction, transformations, loading)

development

development skills needed for implementing
DW/BI systems, 15

in waterfall development approach, 31
etl/, in project directory structure, 242
Event flows

finding user stories in, 98
use-case modeling and, 96

Evolutionary design
Agile modeling, 149–151
blue-green deployment, 169–170
case study. See Data warehouse architecture,

adaptive design
data model patterns, 152–154
database versioning, 170–171
deploying warehouse changes, 167–169
determining amount of up-front design, 148–149
developer practices, 147
facing gnarly problems, 296
how to refactor, 165–167
managing technical debt, 154–157
overview of, 141–144
reasons for using, 171–174
refactoring and, 157–162
scenario illustrating, 144–146
summary (wrap up) of, 189–191
what it is, 144, 146–147
when to refactor, 162–164

Excel

storing product backlog on, 107–108
tracking progress on daily basis, 53

Expectations
mismatched, 31
stakeholder showcases for aligning, 79–80

Experimental branches
naming, 248
uses of branching, 247

Exploration factor, 78
Exploratory testing, in Agile testing framework, 199
Explore phase, in Envision→Explore cycle, 33–35
Extraction, transformations, loading (ETL)

development
development skills needed for implementing

DW/BI systems, 15
in waterfall development approach, 31

eXtreme Programming. See XP (eXtreme
Programming)

Extreme Programming Explained (Beck), 37

F
Face-to-face communication

in collaboration, 72
colocation of teams facilitating, 44
in doer collaboration, 77

Fail fast and adapt approach, 18–19
Failure

characteristics of project failure, 18
lack of success in DW/BI development, xix–xxii
project failure statistics, 291–292
project success/failure measured by user satisfac-

tion, 18
Farley, David, 169–171
FDD (Feature Driven Development)

flavors of Agile development, 38–39
overview of, 18
parking lot diagrams, 117

Features
focusing on early and continuous delivery of BI

features, 291–292
iterations ending with feature review, 34
parking lot diagrams and, 117
showcase for, 260

Files
build specification, 268
conflict resolution, 238–240
explanatory files in repository, 241
properties files, 270
working with, 233–235

Fit framework, for test automation, 205
FitNesse, for database testing, 206–207
Flexibility, balancing with structure, 137
Flip charts, in story-writing workshop, 87
FlixBuster example, 22–23
Fowler, Martin, 20, 152, 157, 162, 180
Functional testing. See also Acceptance testing

in Agile testing framework, 199
testing content, 223
testing user controls, 222–223

G
Gat, Israel, 155–156, xxxiv
Gilbert, Scott, xxxiii
Glass-house collaboration, 71
Glass-house development, 134–136
GLOSSARY file, in repository, 241

Goals
adoption strategies and, 302–303
considering in story-writing workshop, 86–87
use-case modeling, 96

Golden, Jonathon, xxxiv
Governance, corporate regulations and, 137
Groups, decision making by, 123–125

H
Hangovers, 133–134
Hay, David, 152–154
Highsmith, Jim, 5, 26, 39, 49, 57, 68–69, 77, 136, 156,

222, 302, xv–xvi, xxii–xxiii, xxxiii–xxxiv
Hitchman, Steve, 306
Hohmann, Luke, 102, xxxiii
How Buildings Learn (Brand), 148
Hughes, Ralph, 86, 306, xxv, xxxiv
Humble, Jez, 169–171

I
Imhoff, Claudia, 306
Implementation Patterns (Beck), 152
Information radiators (Cockburn), 71
Infrastructure, precursors for Agile projects, 79–80
Inmon, Bill, 13, 15–16, 177, 191
Inmon-style architecture, for data warehousing, 13,

15
Installation, automation of, 261
Instant messaging, for virtual colocation, 77
Integration testing

failing tests, 122
overview of, 195

Integration tier, in data warehousing architecture,
13–14

Iron triangle planning, Being Agile vs. Doing Agile,
293–294

Iteration zero
adopting testing tools and methods during, 222
overview of, 82
planning in FlixBuster scenario, 29

Iterations
dividing project plan into, 88–89
meeting commitments and, 132–133
misconceptions regarding Agile development and, 11–12
planning iteration zero in FlixBuster scenario, 29
planning sessions for, 34
planning to capacity, 47
retrospective, 63–64
tagging end of, 246, 248
timeboxing, 43
tracking progress on daily basis, 50–51

Iterative, incremental, evolutionary development
characteristics of Agile Analytics, 7–8
project management approach to, 26

J
Jankovsky, Bob, 154
Jeffries, Ron, 37, 40
JUnit framework, for test automation, 205
Just enough design, tenets of project management,

39–40
Just-in-time data modeling (Ambler), 91
Just-in-time warehouse design, 146

K
Kerievsky, Josh, 152
Kimball, Ralph, 13, 15–16, 154, 177, 306
Kimball-style architecture, for data warehousing, 13, 15

L
Leadership responsibility, in adopting Agile

approach, 302
Load testing, 201
Locking protocols, version control and, 238–240

M
Maeda, Masa, 304
Mah, Michael, 305
Mainline, keeping development on, 251
Management

Being Agile vs. Doing Agile, 293–294
leadership responsibilities for adopting Agile

approach, 302–303
project management. See APM (Agile Project

management)
self-management, 121
traditional approach, 36

Marick, Brian, 197
Martin, Robert C., 206
Maven, 273
McKenna, Jeff, 37
mdx/, in project directory structure, 242
Meetings

limiting membership in, 70–71
qualities of effective, 69–70

Members, of Agile community
core group, 66
Identifying and filling roles, 67
multiple roles of, 65–66
overview of, 64–65

Mentoring
in adoption process, 303
in co-dev user groups, 75

Merging capabilities
keeping things simple, 252
version control capabilities, 238

Message bus, for pushing data, 177–178, 182–184
Message-driven warehouse

architectural overview, 177–179
development of, 175
product evolution, 175–177

Metadata dictionary, in message-driven warehouse,
177

Metadata, storing, 231
Metrics, measuring success of Agile adoption pro-

cess, 305
Microsoft Excel

storing product backlog on, 107–108
tracking progress on daily basis, 53

Modeling
adaptive. See ADM (adaptive data model)
data model patterns, 152–154
data modeling. See Data modeling
evolutionary design and, 144
Observation Message Model. See OMM (Obser-

vation Message Model)
principles of, 151
prioritizing backlog, 108–109
Satir Change Model, 300–301
traits of, 149–150
use-case modeling. See Use-case modeling

Monitoring
feature completion not task time, 54–56
types of automation, 261
velocity of work over against capacity, 47–49

Monitoring devices, as prerequisite for project auto-
mation, 262

Moss, Larissa, 306
Motivation, factors in, 122
Mundy, Joy, 306
The Mythical Man-Month (Brooks), 132

N
NAnt build example

defining directory structure, 269–270
defining project, 268–269
defining tasks, 270–271
defining testing tasks, 271–273

Network administrators, skills needed for imple-
menting DW/BI systems, 16

Nonfunctional requirements, epic-splitting
approaches, 101

NoSQL databases, 298

O
Object-oriented programming, TDD designed for,

216
Observation Message Model. See OMM (Observa-

tion Message Model)
Observations and Measurements (Cox), 180
OLAP (online analytical processing), 15
O’Leary, Dan, xxxiii
OMM (Observation Message Model)

in creation of adaptive warehouses, 190
data adapter receiving OMM message payload,

184–187
message bus for pushing data, 182–184
in message-driven warehouse, 179–182
overview of, 178

On-demand technologies, 298–299
One-step builds, 260
Online analytical processing (OLAP), 15
Open-source software (OSS), for version control,

253
Operational boundaries, epic-splitting approaches,

101
Operational databases, as test point, 210
Optimistic locking, 239–240
osql.exe, 273
OSS (open-source software), for version control, 253

P
Package, release

bundling, 283–284
creating, 286–287
organizing, 285–286
what it contains, 284–285

Pair programming, in project automation scenario,
259

Parking lot diagrams, 117–119
Patterns

adaptive data model, 153–154
in creation of adaptive warehouses, 190
data model, 152–153
right use of, 154

Patterns of Data Modeling (Blaha), 152
Pentaho, vendors offering Agile enabled technologies, 306
Performance, factors motivating, 122
Performance testing, 200–201
Personas, user roles and, 94–95
Phaal, Robert, 303–304
Phased-sequential development. See also Waterfall

development, 30–32
Pink, Daniel, 122
Plan→Do model

Envision→Explore cycle as alternative to, 32
waterfall development approach as, 30–31

Planners
in Agile community, 64
collaboration, 78–79

Planning
Iron triangle planning, 293–294
iterations, 34

Preproduction environment, in BI systems, 211
Presentation tier, in data warehousing architecture,

13–14
Principles

for Agile Analytics, 9–10
of Agile modeling, 151

Prioritization, of product backlog
backlog management, 111
capability-based, 109–110
overview of, 107–108
process of, 110
user stories and, 88
value-based, 108–109

Problem solving, facing gnarly problems, 296–297

Product backlog
capability-based prioritization, 109–110
continuous backlog grooming, 111
hangovers, 133–134
managing changes in user stories, 34
prioritization of, 107–108
prioritizing user stories and, 88
updating, 63
value-based prioritization, 108–109

Product-driven development, in Agile approach, 34
Product evolution, for message-driven warehouse,

175–177
Product ownership, in co-dev user groups, 74–75
Product validation, Marick’s perspectives for accep-

tance testing, 198
Production environment, in BI systems, 211
Production quality, characteristics of Agile Analyt-

ics, 8
Productivity, emphasizing quality and value as basis

of, 294
Pragmatic Programmer book series (Thomas and Hunt), 245
Progress, tracking on daily basis, 49–53
Project-chartering session, in FlixBuster scenario, 29
Project documentation. See also Documentation
Project manager, in Agile approach, 36
Projects

defining, 268–269
documenting, 231
managing. See APM (Agile Project management)
precursors for Agile projects, 80–81
proof-of-concept projects, 298

Projects, automating
build automation. See Build automation
continuous integration, 261
overview of, 257–258
prerequisites for, 261–262
push button releases. See Push button releases
scenario illustrating, 258–260
setting up continuous integration, 277–281
summary (wrap up) of, 288–290
what it is, 258, 260–261

Proof-of-concept projects, 298
Properties files, 270
Purpose

clarity of, 150
well-defined purpose as basis of collaborative

sessions, 70

Purpose Alignment Model (Pixton, et al.), 109
Push button releases

bundling, 283–284
creating release package, 286–287
keeping track of versions, 287–288
organizing the release package, 285–286
overview of, 281–282
preparing, 282–283
types of automation, 261
what goes into the release package, 284–285
what is a release, 281–282

Push reporting
demand for, 173–174
in message-driven data warehouse, 177–178,

182–184

Q
QlikView, vendors offering Agile enabled technolo-

gies, 307

R
README files, 241
Real-time analytics, advanced BI techniques, 298
Refactoring

database refactoring categories, 159
how to, 165–167
overview of, 157–158
what it is, 158–159
when to, 162–164

Refactoring Databases (Ambler and Sadalage), 162,
170

Refactoring to Patterns (Kerievsky), 152
Reference (conceptual) data model, consistency

with, 144–146
Reflection

frequency of, 73
iterations ending with, 34

Relational database, storing product backlog in,
107–108

Release branches
frequent release of production-quality software,

193
naming, 248
uses of branching, 247

Release managers, 282–283
Release teams, 282–283

Releases. See also Push button releases
bundling, 283–284
controlling, 227
creating release package, 286–287
keeping track of versions, 287–288
organizing the release package, 285–286
planning calendar for, 29
preparing, 282–283
storing release scripts in version control reposi-

tory, 232
tagging, 246, 248
timeboxing release cycles, 42–43
what goes into the release package, 284–285
what it is, 281–282

Repetition, avoiding in collaborative sessions, 72
Reports, scripting user-defined, 285
Repositories

CM (code management) repository, 212–214
explanatory files in, 241
organizing, 240
revision history in, 235
storing all project artifacts in single repository,

253
storing testing frameworks in version control

repository, 273
for version control, 230
for warehouse, 184
what not to store in version control repository,

232–233
what to store in version control repository,

230–232
Requirements

user stories representing, 91–92
in waterfall development approach, 30

Responsibility, self-organization requires shared
responsibility, 128–130

Retrospectives
in co-dev user groups, 75–76
on failed DW/BI project, xxi–xxii
iteration review, 34, 63–64

Review, stakeholder, 129
Revision history, in repository, 235
Rewind ability, benefits of version control, 227
Road map, for adoption strategies, 303–304
Role-playing, in collaboration session, 62
Roles, in writing user stories, 93–95
Ross, Margy, 154, 177

S
Sadalage, Pramod, 158, 162–163, 165–166, 168, 212,

xxxiv
Sandboxes

for continuous integration, 259
for experimentation, 298–299
for testing, 211–215
version control and, 227

Satir Change Model, 300–301
Scheduled builds, 261, 276
Scheduling, in scope, schedule, and cost triangle,

41, 43
Schwaber, Ken, 37–38
Scope, in scope, schedule, and cost triangle, 41, 43
Scribble frames, 196
Scripts, for assisting development, 284–285
Scrum

Agile Analytics compared with, 11
flavors of Agile development, 37–38
resources on, 306

Scrum master role, 27
Scumniotales, John, 37
SDUF (sufficient design up front), 144
Self-discipline, required by self-organizing teams,

126–127
Self-management, 121
Self-organizing teams

in APM, 53
characteristics of Agile Analytics, 9
corporate alignment required, 136–137
glass-house development and, 134–136
honoring commitments, 132–133
overview of, 121–122
scenario illustrating use of, 122–126
self-discipline in, 126–127
shared responsibility in, 128–130
summary (wrap up) of, 137
watching out for hangovers, 133–134
what they are, 122
working agreements in, 130–131

Setup time, for automated testing, 221–222
Shared responsibility, required by self-organizing

teams, 128–130
Sharing, benefits of version control, 227
Showcases

aligning stakeholder expectations, 79–80
for feature review and acceptance, 61–63

for features, 260
glass-house development and, 136
iterations ending with, 34
for project management sponsors and stakehold-

ers, 63
Silverston, Len, 306
Simplicity, traits of Agile modeling, 150
“Six Thinking Hats” method (de Bono), 123–126
Slebodnick, James, xxxiii
Software

Agile Analytics compared with Agile software
development, 3

frequent release of production-quality, 193
key perspectives in testing, 197–198

SOR (System of Record) database, for message-
driven warehouse, 177, 187–188

Source code, in release package, 284
Source systems, in data warehousing architecture, 13
sp/, in project directory structure, 242–243
Split Table refactoring, 160
Sponsors, in adopting Agile approach, 302–303
Spreadmarts (Eckerson), 291–292
sql/, in project directory structure, 243
SQLUnit, as database unit testing tool, 205–206
Stakeholders

diversity in consumer community and, 73
periodic review, 129
showcases for aligning expectations of, 79–80

Stand Back and Deliver (Pixton, et al.), 109
Stand-up meetings

synchronization of work by teams on daily basis, 41
timeboxing, 43

Standards
compared with patterns, 152
selecting version control tools, 253

STDD (storytest-driven development), 99, 218–220
Stories. See User stories
Story conference (Hughes), 86
Story-point estimating, 111–117
Story-test cases, 99
Story-writing workshop, 86–89
Storytest-driven development (STDD), 99, 218–220
Storytests

in Agile testing framework, 199
generating, 219–220
process under test, 204
in test-driven development, 218

Strategies, for testing, 198–201
Stress testing, 201
Stretch goals, 89
Strict locking, conflict resolution, 238, 240
Subject areas, parking lot diagrams and, 117
Subject-specific data marts, 15
Subversion

selecting version control tools, 254
storing code in CM repository, 212

Successes, measuring success of adoption process,
305

Sufficiency
applying the use of barely sufficient processes, 8
barely sufficient data modeling, 40
vs. ad hoc or hacking, 12

Sufficient design up front (SDUF), 144
Sutherland, Jeff, 37
svcs/, in project directory structure, 242
Synchronization, of project on daily basis, 41
System of Record (SOR) database, for message-

driven warehouse, 177, 187–188
Systems

design in waterfall development approach, 30
integration, 12
testing, 197–198
validation (Marick’s perspectives for acceptance

testing), 198

T
Tags

creating, 229
naming, 248–249
standards for, 245
version control capabilities, 236–237
when to tag, 245–248

Tasks
monitoring feature completion not task time, 54–56
task management as focus of traditional develop-

ment, 36
TDD (test-driven development)

overview of, 215
storytests, 218–220
unit testing, 215–218

TDWI (The Data Warehousing Institute), 306, xvii,
xxxv

Teams
accountability of, 128

colocation of, 44–45
making Agile practices habitual, 292
self-organizing and self-managing. See self-orga-

nizing teams
synchronization of work on daily basis, 41
team management as focus of Agile project man-

agement, 36
Technical acceptability, Marick’s perspectives for

acceptance testing, 198
Technical debt

CoC (cost of change) and, 155
managing, 45–46, 154–157
monetizing, 155–156
overview of, 154–155
prioritizing, 156–157

Technology, handling emerging, 298–299
Test automation

challenges in, 201–203
DbFit utility for, 216
emerging technologies impacting, 299
overview of, 193
requiring shift in team work habits, 292
setup time for, 221–222

Test cases, storing in version control repository, 232
Test data set, in BI testing process, 203
Test-driven development. See TDD (test-driven

development)
Test-Driven Development: By Example (Beck), 215
Test-first approach. See also TDD (test-driven devel-

opment), 215
test/, in project directory structure, 243
Test suites

in release package, 284
storing in version control repository, 232

Testing frameworks, in release package, 284
Tests/testing

black box technologies for BI testing, 211
challenges in automating, 201–203
database testing tools, 205–209
defining testing tasks in build automation,

271–273
functional approach to BI testing, 222–223
guidelines for BI testing, 220–221
key perspectives in testing software and systems,

197–198
key testing points in data warehouse architec-

ture, 209–211
overview of, 193–194

process of BI testing, 203–205
sandbox for, 211–215
scenario illustrating use of, 195–197
setup time for automated testing, 221–222
storing testing frameworks in version control

repository, 273
strategies, 198–201
summary (wrap up) of, 223–224
test-first approach. See TDD (test-driven

development)
what is Agile Analytics testing, 194–195

Time zones, as impediment to collaboration, 78
Timeboxing

applying to releases, iterations, and schedules, 43
benefits of, 44
developed by DSDM Consortium, 42
proof-of-concept projects, 298

Tools
for build automation, 268
for continuous integration, 277
for database testing, 205–209
lack of support making Agile development dif-

ficult, 21
precursors for Agile projects, 79–80
for use in Agile testing framework, 199–201
for version control, 252–254

TortoiseSVN, 254
Tracking progress, on daily basis, 49–53
Training, as part of adoption strategy, 303–305
Triggered builds

overview of, 277
types of automation, 261

TSQLUnit, 206

U
UML (Unified Modeling Language), 96
Unit testing

in Agile testing framework, 198
database testing tools, 205–206
overview of, 195
process under test, 204
in test-first approach, 215–218
tools for automating, 199
when it occurs, 200

Up-front design
avoiding high costs of change later, 297
BDUF (Big Design Up Front), 144

determining amount of, 148–149
SDUF (sufficient design up front), 144

Updating workspace, 234–235
Usability testing, 200
Use-case modeling

diagram and details, 87, 97
finding user stories in event flows, 98
overview of, 96–97
scenarios, 98–99
in story-writing workshop, 86–88

User roles
in development of user stories, 93–95
use-case modeling and, 96

User stories
backlog management, 111
basing on smallest, simplest thing, 103–107
capability-based prioritization, 109–110
caution regarding epics and antistories, 90–91
characteristics of well-written, 89–90
decomposing epics, 99–102
finding in event flows, 98
monitoring feature completion not task time,

54–56
overview of, 85–86
parking lot diagrams, 117–119
presenting on scribble frames, 196
prioritization of product backlog, 107–108
prioritization process, 110
representing requirements, 91–92
small user stories in Agile product driven devel-

opment, 34
story-point estimating, 111–117
story-test cases, 99
story-writing workshop, 86–89
summary (wrap up) of, 119–120
use-case modeling, 96–99
user roles and, 93–95
value-based prioritization, 108–109
what they are, 86

Users
acceptance tests. See Acceptance testing
collaboration. See Consumer collaboration
focusing on customer satisfaction, 291–292
functional testing of user controls, 223
not short-circuiting customer collaboration,

294–295

project success/failure measured by user satisfac-
tion, 18–19

scripting user authentication, 285
util/, in project directory structure, 243
Utility scripts, prerequisites for project automation, 262
utPL/SQL, 206

V
Value-based prioritization, of product backlog,

108–109
Value-driven development, 8
Velocity, of development

establishing for project, 89
story-point estimating and, 115–116
tracking against capacity, 47–49

vendor/, in project directory structure, 243
Vendors, offering Agile enabled technologies, 306–307
vendorsrc/, in project directory structure, 243
Version control

choosing tool for, 252–254
conflict resolution, 238–240
directory structure for, 241–245
explanatory files in repository, 241
keeping things simple, 251–252
naming tags and branches, 248–249
organizing repository, 240
overview of, 225–226
prerequisites for project automation, 261
repository for, 230
revision history in repository, 235
scenarios, 227–229, 249–250
storing testing frameworks in version control

repository, 273
summary (wrap up) of, 254–256
tags, branches, and merging capabilities, 236–238
test guidelines, 221
tracking versions, 287–288
what it is, 226–227, 229–230
what not to store, 232–233
what to store, 230–232
when to tag and branch, 245–248
working with files and, 233–235

views/, in project directory structure, 244
Views, scripting user-defined, 285
Virtual colocation (Highsmith), 45, 77–78
Virtualization, benefits of, 275

Vision. See also envisioning, 32–33
Visual controls, glass-house development and, 135–136
Voice-to-voice communication, in collaboration, 72
VoIP, for virtual colocation, 77

W
WatiN, tool for acceptance testing, 195
Warehouse. See Data warehousing
Waterfall development approach

collaboration in, 60
Envision→Explore cycle as alternative to, 30–31
small steps in Agile development mirroring

stages of, 41
WBS (work breakdown structure)

estimating based on, 112
traditional planning based on, 35

Web cameras, for virtual colocation, 77
WhereScape RED, vendors offering Agile enabled

technologies, 306
Whiteboard

in story-writing workshop, 86–87
use by team in problem solving, 123–124

Wikis, glass-house development and, 136
Wish-based planning, vs. capacity-based planning, 49
Work breakdown structure (WBS)

estimating based on, 112
traditional planning based on, 35

Working agreements
glass-house development and, 135
required by self-organizing teams, 130–131

Workmanship, precursors for Agile projects, 80–81
Workspace

updating frequently, 234–235
for working with files, 233

X
XMLA files, 231
xmla/, in project directory structure, 244
XP (eXtreme Programming)

Agile Analytics compared with, 11
flavors of Agile development, 37
unit-test-driven development and, 215

xUnit
automating unit testing, 215
frameworks based on, 221
testing tools, 205–206

Y
YAGNI (You Ain’t Gonna Need It), 40

Z
Zinkgraf, Dale, xxxiv
