Because xAPI Profiles, to some extent, codify what an organization values in human performance, implementing an xAPI Profile means implementing a tacit set of rules that bind a learning experience. Ideally, these rules reflect the values of the organization responsible for producing the xAPI Profile. They quickly become invisible to the learner, and even to the stakeholders themselves, yet they shape operating behaviors. Any bias or logical flaw is reinforced as reporting, and the decisions based on that reporting, play out over time. An xAPI Profile can influence how leaders quantify and evaluate learning and work performance, often entirely out of the view of the people whose work is being evaluated. It is therefore an ethical, financial, and potentially legal imperative to be transparent and to communicate well when defining xAPI Profiles, so that they provide meaningful metrics for evaluation.
This section of this document in the xAPI Profile Server Library attempts to expose that tacit and implicit information, offering the reader an architect's perspective. The goal is to help the reader distinguish what is technically possible from what is pragmatically sound, so that future technology and practice may mature independently.
# Prioritizing What To Measure
The authoring of xAPI Profiles should mature into a skilled professional practice, with refined pedagogical models that stakeholders, practitioners, and even learners may broadly recognize and understand.
At the time of this writing, xAPI Profiles are in their infancy, and this document does not prescribe a process or methodology for gathering the research needed to inform an xAPI Profile. It does, however, provide context, principles, and practical implementation considerations for authoring one. The pioneers working with xAPI Profiles since 2017 suggest that it is not good practice to track whatever is easy to identify, or simply to track what engineers in a specification group suggest. Instead, the author should answer the question "What should we measure?" by considering the value the organization hopes to realize for itself, if not for the learners themselves.
Analytics, in general, reflect how an organization demonstrates accountability. To that end, the effectiveness of an xAPI Profile depends largely on a high correlation between what people in the organization say they value during research and how those people assign value in practice. In other words, it is important to capture requirements that translate into solutions with measurable success criteria across observable developmental milestones.
There are three vectors to consider when deciding what to measure with analytics. The first is financial, which in practice is likely to be a gatekeeper for many analytics projects. The cost of capturing, processing, and managing analytics should be weighed against the value of the information to the organization, both in terms of savings and in terms of identifying new areas of opportunity.
The second vector goes beyond measures of profit, loss, cost, and expense. While financial information often appears accessible and transparent, numbers have a way of being manipulated to clever ends, so prioritizing what to measure must include other considerations. Engagement and satisfaction are relevant to learning activity, so the question of what to measure should not be limited to what informs investment and budget decisions. Consider how access to certain information about the learning experience may benefit the learner directly and indirectly. How might information provide early warning signs of capacity challenges, knowledge gaps, or overtraining?
Ultimately, the third and most important consideration in prioritizing what to track is whether the measurement is ethically sound. Protecting personal data and addressing individuals' ethical concerns are top priorities. How learning and performance success is interpreted will be reinforced, and largely structured, by the schema of an xAPI Profile.
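To make that last point concrete, the hedged sketch below shows how a Profile's vocabulary structures what reporting can "see": a statement is only meaningful to Profile-aware analytics if it uses the verbs the Profile defines. The `example.org` IRIs and the verb names are hypothetical placeholders, not drawn from any published xAPI Profile; only the statement shape follows the xAPI specification.

```python
# Hypothetical vocabulary a Profile might define; these IRIs are
# illustrative placeholders, not from any published xAPI Profile.
PROFILE_VERBS = {
    "https://example.org/profiles/checklist/verbs/completed",
    "https://example.org/profiles/checklist/verbs/skipped",
}

def statement_uses_profile_vocabulary(statement: dict) -> bool:
    """Return True if the statement's verb is one the Profile defines.

    Reporting built around the Profile only 'sees' behaviors expressed
    in this vocabulary; anything else is effectively invisible, which
    is how a Profile's biases become structural.
    """
    return statement.get("verb", {}).get("id") in PROFILE_VERBS

# A minimal xAPI statement using the hypothetical Profile verb.
statement = {
    "actor": {"objectType": "Agent", "mbox": "mailto:learner@example.org"},
    "verb": {
        "id": "https://example.org/profiles/checklist/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "objectType": "Activity",
        "id": "https://example.org/activities/preflight-checklist",
    },
}

print(statement_uses_profile_vocabulary(statement))  # prints True
```

A learner action expressed with any verb outside `PROFILE_VERBS` would return `False` here, illustrating how the Profile's schema, not the learner's actual behavior, bounds what gets measured.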
# Recognizing Likely Reasons for a New xAPI Profile
There are no technical rules governing the conditions under which an xAPI Profile is created. Drawing on seven years of informal and formal practice around learning data strategies, and xAPI Profiles in particular, three scenarios lend themselves to developing a new xAPI Profile.
- A wholly unique learning experience requires a data architecture. For example: a trade group is working on a competency model expressed with an xAPI Profile for normalized measurement and analysis of conformant learning activities. This allows multiple publishers of training content, instructional aids, authoring tools and reporting services to work dynamically with the same set of semantically and technically interoperable frameworks.
- A common definition for a modality of learning. For example: A standards group is charged with translating anticipated content/client data exchanges and user interactions into a measurable performance context. A hypothetical scenario could be: the Veterans Administration working with IEEE to standardize learning interactions (modalities) for nurses within an Emergency Health Record System to evaluate against trackable patient outcomes.
- An organization intends to govern its own xAPI data model. For example, the Naval Education and Training Center (NETC) codifies an xAPI Profile to normalize its capture and analysis of performance support activities, such as checklists, and their use by instructors and sailors. Adopters of the Total Learning Architecture (TLA) codify an xAPI Profile describing valid statements and the expected systemic outcomes that depend on particular properties of the captured data.
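In each of the scenarios above, the deliverable is a Profile document. As a rough sketch of what such a document looks like, the snippet below builds a minimal xAPI Profile skeleton following the general JSON-LD shape described in the xAPI Profiles 1.0 specification. All `example.org` IRIs, names, and label text are hypothetical placeholders; a real Profile would carry the authoring organization's own identifiers and a fuller set of concepts, templates, and patterns.

```python
import json

# Minimal skeleton of an xAPI Profile document (JSON-LD). The IRIs and
# names are illustrative placeholders, not a published Profile.
profile = {
    "@context": "https://w3id.org/xapi/profiles/context",
    "id": "https://example.org/profiles/checklist",
    "type": "Profile",
    "conformsTo": "https://w3id.org/xapi/profiles#1.0",
    "prefLabel": {"en": "Checklist Performance Support Profile"},
    "definition": {"en": "Normalizes capture and analysis of checklist use."},
    "versions": [
        {
            "id": "https://example.org/profiles/checklist/v1.0",
            "generatedAtTime": "2024-01-01T00:00:00Z",
        }
    ],
    "author": {"type": "Organization", "name": "Example Trade Group"},
    # Concepts give shared statements their semantic interoperability:
    # every adopter reports against the same verb IRIs.
    "concepts": [
        {
            "id": "https://example.org/profiles/checklist/verbs/completed",
            "type": "Verb",
            "inScheme": "https://example.org/profiles/checklist/v1.0",
            "prefLabel": {"en": "completed"},
            "definition": {"en": "The actor finished every checklist item."},
        }
    ],
}

# Serialize for publication, e.g. to an xAPI Profile Server.
document = json.dumps(profile, indent=2)
print(len(document) > 0)  # prints True
```

Publishing a document like this is what lets multiple content publishers, authoring tools, and reporting services, as in the trade-group scenario above, work against one semantically and technically interoperable framework.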