An Inside Look at JA Program Development
Author: Hannah Henry
Published: Tuesday, 18 Sep 2018
From financial literacy and work readiness to entrepreneurship, Junior Achievement (JA) provides countless programs to equip today's youth with skills that will help them succeed tomorrow.
To get a better idea of how we create our programming, we pulled in our Senior Vice President of Education & Learning Technologies, Mary Catherine (MC) Desrosiers, for an interview. In her role, MC oversees the education group, managing the blended transformation and directing the ideation, design, implementation, and evaluation of new programs and learning technologies.
Q: What is your job role?
Senior Vice President, Education and Learning Technologies
Q: How long have you been with JA?
Q: How long does it take to create and launch a program/learning experience?
A program doesn't have a set amount of time to create and launch. What's most important are the stages that we go through in order to conceive of and create a new program. From the time we start researching until the time we launch a pilot, it's typically a year.
Q: What stages are needed / used from start to implementation?
Discovery, Design, Production, Launch, Implementation, and Evaluation.
Q: How are JA programs/learning experiences developed?
MC: We start by identifying real, high-priority opportunities. Then we lay the foundation for success for a specific opportunity through user and market research that identifies unmet needs and clarifies our goals, scope, and audience needs. We undergo a process of discovery at a strategic and program level, meaning that we are constantly evaluating our market and our users and deciding what we need to teach and to whom. That research informs our decisions about which programs to develop, why we should develop them, and whose needs we can meet. For each program we develop, we undergo a specific discovery process and use design thinking to develop our prototypes.
During our design phase, we focus on creating a program vision, understanding specific users' needs, and describing a new experience that is engaging and promises to demonstrate the desired learning outcomes. Design has two parts: Concept Design and Prototyping; and Program and Content Design.
During concept design and prototyping, we make ideas tangible and test them with our audiences and other stakeholders. The result of user testing may be that one concept is a clear winner, that elements of several are successful, that another idea emerges, or that we need to rethink. During program design, we work through the user journey and define the requirements for each user group. We work through our technical architecture and specifications, figure out the "look and feel" of the program, and design an assessment and evaluation strategy. After all that is finished, we can start writing and developing the learning materials for our students, volunteers, and educators.
Q: What factors or elements are considered?
There are a lot of factors considered in each phase of our process. I'll name a few. As we do our strategic discovery, we investigate general and education market trends, the latest learning research, the competition, state standards, and trends in technology-enabled learning experiences. We consider our diverse audiences and their needs, trends or developments in a specific area of content, and the various learning environments in which our content may be delivered.
Q: How do you go about testing?
We start with the end in mind, so we are always working toward a goal or plan that can be tested. Production is an iterative process: testing and revising are ongoing, with each round of testing informing the next steps in development. We use both lean product development and rapid innovation testing. By the time we get to alpha testing, we have the first version of the entire program, the technology architecture is finalized, and the programmers have integrated the technology that will enhance program delivery.
We pilot all of our programs. Field testing and providing support to the JA Areas during implementation are critical to the success of JA programs. During the implementation phase, the team, led by JA USA Field Program Services, tests all components of the program with selected JA Area pilot sites (beta test), creates all training materials, and prepares implementation guidelines for the JA Areas.
A formative evaluation is designed and used to improve the program, especially when it is still being developed. During Design, Production, and Implementation, we collect and analyze qualitative data to understand how well a program is working and ways we might improve it.
During beta testing, the JA evaluation group conducts a formal formative evaluation that explores how well the program elements work and align with intended learning objectives.
A "launch impact evaluation" describes the assessment we conduct on a newly-developed blended program during the pilot phase/alpha testing. For kit-based programs, we continue to refer to Phase 1 as a formative evaluation.
When all the components are complete and tested, we fully launch a program to the JA Areas. The JA Areas recruit and train volunteers and work with schools to deliver the program.
But our testing doesn't stop once the program is in the field. Once we formally launch a program to the JA network, we begin a summative evaluation designed to present conclusions about the merit or worth of an intervention and recommendations about whether it should be retained, altered, or eliminated.
A "comprehensive impact evaluation" describes the summative evaluation of a blended program that is conducted after the learning environment has stabilized. It measures student learning gains, changes in perceptions and attitudes, and other meaningful dimensions of interest.
Q: Who do you consult with when developing a program?
We consult with subject matter experts in content areas as well as people who have expertise in working with learners of different ages. We consult with teachers, industry leaders, and our JA Area partners, particularly our pilot sites, and we also seek input from our R&D teams. And last, but certainly not least, we consult with students. We speak to kids to get their feedback and involve them in our design thinking process.
Q: What changes in the education / lesson planning industry have you (or your team) had to navigate?
The education marketplace is continually changing. Incorporating technologies into blended programming while still meeting the needs of classrooms without technology has been important. Incorporating evolving technologies like augmented reality, and keeping pace with educational trends such as project-based learning, the flipped classroom, brain-based learning research, and cognitive science, keep us busy.
Q: What changes do you anticipate for the future?
I anticipate a focus on self-efficacy and competency-based learning.
Q: How has the integration of innovative technology like VR changed how students learn with JA programs?
We continue to consider ways in which we can provide simulated experiences, like JA Finance Park Virtual or the small augmented reality experiences in JA Our City. Since we're preparing students for "the real world," the virtual world holds some exciting possibilities for us.