The Behavior Bank: The Secret to Success

2/19/2015
 

Submitted by Guest Blogger Ryan O'Donnell, Brohavior

The Behavior Bank

According to Carl Koenig, a graduate student of Ogden (Og) Lindsley's in the late 1960s and early 1970s, Og was once approached by a parent asking how best to teach a certain concept. Og rifled through a series of charts in his hands to answer the question, but realized that the answer should really rest on a much larger dataset. Large-scale analysis was (and for the most part continues to be) left to the individual scientist, and a plethora of practical barriers has limited the progression of behavioral science and Precision Teaching. The greatest barriers to analyzing large datasets include:

1. Acquiring extensive training in the visual analysis of data
2. Possessing knowledge of the various treatment programs implemented for an individual, as well as across individuals (i.e., historical context within and across the people served)
3. Committing the extensive time the analysis requires

Unfortunately, the majority of practitioner training programs severely undervalue effective training in visual analysis and in the various tools available to scientists (e.g., the Standard Celeration Chart [SCC]). Additionally, most practitioners are trained on a standard set of procedures identified by the agreed-upon standards or best practices of a field (e.g., the Behavior Analyst Certification Board Task List). The alternative is elegantly designed instructional content that aims for outcome-based results demonstrating both fluency and generativity across areas of behavioral science. The time-intensive practice of comparing datasets, however, is where things get intriguing. Across the sciences, automating tedious tasks has shown how much humanity can benefit from well-applied computer programming, yet the behavioral sciences and education have yet to really harness this power. This is where the true beauty of Ogden Lindsley's visionary mind comes out.

After rifling through charts to answer the parent's question, Og appointed Carl Koenig to oversee the creation and implementation of the Behavior Bank. The Behavior Bank was a simple idea: for each complete SCC submitted to the bank, the submitter could ask one question that would be answered from the data already in the bank. The ultimate goal was to gather as much data as possible on human behavior, store it all in one location, and mine it to identify relationships such as the following (a rough sketch in code follows the list):

Frequency and Celeration Expectancies (how fast or slow, and in what direction, will a behavior change?)
Frequency and Celeration Multiplier Expectancies (how large a change, and in what direction, will a change in intervention produce?)
Effective and Ineffective Teaching Procedures (what works, and what does not, under given circumstances?)
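
To make that concrete, here is a minimal sketch in Python of what a modern Behavior Bank record and a celeration-expectancy query might look like. The field names, pinpoints, and numbers are all hypothetical, invented purely for illustration; this is not the bank's actual format.

# A minimal, hypothetical sketch of a modern Behavior Bank record and a
# celeration-expectancy query. Field names, pinpoints, and values are
# invented for illustration only.
from dataclasses import dataclass
from statistics import median
from typing import Optional

@dataclass
class ChartSubmission:
    pinpoint: str        # the behavior being counted, e.g. "sees/says math facts"
    intervention: str    # the teaching procedure in place during the phase
    celeration: float    # weekly multiplier from the SCC, e.g. 1.4 = x1.4 per week

# A few made-up submissions standing in for the bank's contents.
bank = [
    ChartSubmission("sees/says math facts", "flashcard sprints", 1.40),
    ChartSubmission("sees/says math facts", "flashcard sprints", 1.25),
    ChartSubmission("sees/says math facts", "worksheet drill", 1.05),
    ChartSubmission("hears/writes spelling words", "flashcard sprints", 1.30),
]

def celeration_expectancy(pinpoint: str, intervention: str) -> Optional[float]:
    """Median weekly celeration for a pinpoint/intervention pair, if any data exist."""
    values = [s.celeration for s in bank
              if s.pinpoint == pinpoint and s.intervention == intervention]
    return median(values) if values else None

# "How quickly should math facts grow under flashcard sprints?"
print(celeration_expectancy("sees/says math facts", "flashcard sprints"))  # 1.325

With thousands of real submissions instead of four invented ones, the same kind of query is exactly the "expectancy" the bank was meant to deliver.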

The practical limitations, however, were overwhelming and played a strong role in the fall of the Behavior Bank. The first was that the submission forms were completed by hand on more than 20 different scantrons. The forms captured all of the vital information on an SCC: the chart labels, the values of the various data points, the celeration and bounce values, the condition-change dates and the nature of the changes, and so on. Koenig (2012) reported that a full-time clerk checked the accuracy of each scantron before it was fed into an IBM 1130 computer (later upgraded to an IBM 360 and then a 370). These limitations battered Koenig's team. The team required an average of 3.5 months to complete the submission process for each project (ranging from 2 weeks to 27 months), with the direct monetary cost of storing a project running $1 to $3 in the early 1970s. That is roughly $10 per submission in today's dollars, and with more than 30,000 submissions, it adds up quickly! Koenig (2012) reported the ultimate cost of the Behavior Bank was around $250,000 in the 1970s, roughly $1.3 million today!
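
For the curious, those inflation conversions are easy to reproduce. Here is a quick sketch in Python; the CPI multiplier of roughly 5.2 (early 1970s to 2015) is my own approximation, and the exact figure depends on which start year you pick.

# Rough inflation check on the Behavior Bank's costs. The CPI multiplier of
# ~5.2 from the early 1970s to 2015 is an assumption, not an exact figure.
CPI_MULTIPLIER = 5.2

per_project_1970s = (1 + 3) / 2      # $1-$3 per project; take the midpoint
total_cost_1970s = 250_000           # Koenig's reported total

print(f"Per project today: ~${per_project_1970s * CPI_MULTIPLIER:.0f}")    # ~$10
print(f"Total cost today:  ~${total_cost_1970s * CPI_MULTIPLIER:,.0f}")    # ~$1,300,000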

 

Back to the Future - "IBM1130Bletchley" by Martin Skøtt - Flickr: IBM 1130. Licensed under CC BY-SA 2.0 via Wikimedia Commons - http://commons.wikimedia.org/wiki/File:IBM1130Bletchley.jpg#mediaviewer/File:IBM1130Bletchley.jpg


The second limitation was the value of the Behavior Bank itself. The concept relied on having enough data to answer the questions practitioners were struggling to answer on their own; in essence, the bank was only valuable once it was at least partially filled. The rate at which questions were answered could not keep pace with the rate at which they were asked, so the end product for users was essentially nothing. Each answer cost about $3,000 in today's dollars and took an average of 4 months to produce (ranging from 1 to 334 weeks).

Each of these limitations points to an unfortunate anomaly: the Behavior Bank was about 50 years ahead of its time. The processor in an iPhone 4 rivals the supercomputers of the 1970s. With cloud computing, a centralized data store is no longer geographically limited the way a mainframe was. Equally important, today's programming capabilities allow the automation of both visual analysis and the extensive historical knowledge required to analyze across datasets (barriers 1 and 2 above) at lightning speed (barrier 3 above). Og was not only at the forefront of this technology; he was so far ahead of his time that the wider education and performance community had little chance of adopting the practice. Additionally, people like Koenig left the behavioral sciences after the Behavior Bank's collapse, never to look back.

Big Data

The concept of a Behavior Bank is closely related to the buzzword "Big Data." The emphasis on single-case research is clearly a distinguishing characteristic of education and of many psychological traditions (e.g., behavior analysis). However, as technology continues to progress exponentially, it may be time to step outside the comfort zone of N = 1 and venture into the world of N = All. That is what is meant by "big data": N = All. One perceived limitation of this approach (held even by the giants of the PT community today) is that the data are so contextual and individualized that nothing can be concluded from aggregating them. The sole issue with that critique is that if it were true, then collecting data (much less charting it) would be functionally useless.

Blending Big Data and Single-Case Data

Single-case traditions such as behavior analysis and Precision Teaching occupy a peculiar position: they could aggregate single-case research from across the world. Wouldn't it be nice to see the 99.99% of data that are collected but instead end up in banker boxes, never to see the light of day? Yes, there are many practical hurdles to navigate; however, only two currently stand in the way of a large step toward a modern-day Behavior Bank. The first is for the right company to position itself to give society an effortless way to contribute to the bank. Chartlytics has a chance at this, and I hope they keep it in mind as they grow. The second is a shift in the charting community's culture so that sharing data is encouraged rather than left to the influences of a capitalistic culture. The current restructuring of the Standard Celeration Society will be pivotal in creating the contingencies needed for that cultural change, and it will likely require the help of additional organizations.

What is the Secret to Success?

I recently came across a motivational speech on YouTube that I'd like to shamelessly relate to the adoption of a Behavior Bank. Arnold Schwarzenegger lists six rules for success:

1. Trust Yourself

Chart your data, and don't be afraid to share it.

2. Break Some Rules

Standards are extremely important, but some matter more than others. Consider the 34-degree angle that represents a ×2 (doubling) celeration on the SCC: that one is a law not to be broken. The color of your data paths or the point markers you use, however, can change if it leads to better analysis or broader adoption of the SCC and Precision Teaching. Above all, though, follow the data.

3. Don't Be Afraid to Fail

The Precision Teaching process builds in the acknowledgement of failure (Pinpoint -> Record -> Change -> TRY AGAIN). It is going to happen. Own it and try again!

4. Ignore the Naysayers

The practical limitations of building a modern Behavior Bank should not hold back the progression of the science. The entire educational and learning-science community should align around shared values in support of this venture and commit to it. The benefits generated in other fields surely make this a Pepsi Challenge worth taking.

5. Work Like Hell

With the majority of the technological barriers knocked down, and with both Chartlytics and the Standard Celeration Society positioned to provide support, the work is going to be left to the practitioners.

6. Give Something Back

The purpose of the Behavior Bank is to give the community the systems necessary to expand the empirical knowledge base and benefit society at large. The capitalistic culture and the push for privatization within certain communities (e.g., behavior analysis) have created significant barriers to the prosocial behavior required to usher Precision Teaching into the limelight. Let's come together for the greater good and put everything else aside!

If you're interested in discussing this further, comment below or email me directly at ryanodonnell23@gmail.com.

Peace, Love, and happiness!

#ChartOn

Ryan O’Donnell
