Digital Policy is Pedagogy: Why Educators Must Engage

Opening Provocation: Policy is Pedagogy
Walk into any strategic planning meeting at a university and you’re likely to hear talk of digital roadmaps, enterprise procurement, or IT governance. These conversations are often framed as operational or infrastructural concerns - necessary for institutional efficiency, risk mitigation, and digital transformation. What’s striking is how rarely they are understood as pedagogical in nature.
Yet every digital decision made by an institution - whether about the adoption of a new learning platform, the use of analytics dashboards, or the licensing of generative AI tools - fundamentally shapes the environment in which teaching and learning occur. The selection of an LMS configures what counts as assessable, observable, and communicable learning. Data policies dictate what forms of student activity are made visible or rendered actionable. Even the user interface of a dashboard encodes a particular model of learner behaviour and teacher responsibility.
This is not merely an issue of tools. It is a matter of values. As Selwyn (2022) argues, educational technologies always carry embedded assumptions about what learning is and how it should be governed. The decision to adopt a given system is never neutral - it entails a commitment to certain pedagogical logics, often privileging efficiency, scalability, and control over relationality, deliberation, or creativity.
The tendency to silo digital policy as separate from pedagogy reflects a wider trend identified by Williamson and Hogan (2020), in which the pandemic accelerated a managerial turn in education - one that privileges commercial platforms, standardised metrics, and technical solutions over pedagogical dialogue and democratic governance. In this framing, educators become end-users of systems designed elsewhere, invited to “embed” technologies that have already been procured. But this compliance model denies the expertise and professional judgement of educators, who are uniquely positioned to evaluate how digital infrastructures align - or clash - with meaningful learning.
Indeed, as Watters (2021) contends, decisions about technology are always decisions about power, agency, and the future of education. To abdicate those decisions to IT departments or external vendors is to risk the erosion of pedagogical intent. If digital policy is left to technocratic logic alone, the result is not neutrality but the quiet normalisation of certain forms of learning - those that are most easily measured, automated, and scaled.
The central claim of this post, then, is simple but urgent: digital policy is pedagogical. It shapes the conditions of educational possibility. Educators must engage with it - not only to protect their professional agency but to advocate for learning environments grounded in ethics, care, and educational purpose.
From Tools to Logics: How Procurement Shapes Pedagogy
Procurement in higher education is rarely neutral. Decisions about which platforms, systems, or tools to adopt - whether virtual learning environments (VLEs), plagiarism detectors, or predictive analytics dashboards - are often framed as technical or financial choices. Yet these decisions carry profound pedagogical consequences.
When procurement bypasses meaningful consultation with educators, it tends to prioritise features and workflows aligned with managerial goals: standardisation, scalability, efficiency, and compliance. Learning management systems (LMSs), for example, often arrive pre-configured with templates for assessments, quizzes, and discussion boards that reflect behaviourist or transmissive pedagogies. These embedded defaults shape the possibilities for teaching and learning before a course has even been designed (Brown and Duguid, 2000; Selwyn, 2022).
The logic of these platforms is rarely neutral. As Gert Biesta (2010) reminds us, education always involves values and choices - about what is worthwhile to know, how learners should engage, and what counts as success. But when assessment tools are built around rubrics designed for automated scoring, or when analytics dashboards rank students according to click patterns and submission rates, the priorities of pedagogy are silently displaced by the priorities of platform design.
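To make this concrete, here is a deliberately simplified, hypothetical sketch in Python of the kind of scoring logic such a dashboard might apply. The field names, weights, and thresholds are invented for illustration; they are not drawn from any actual product.

```python
# A toy "engagement score": invented fields and weights, for illustration only.
from dataclasses import dataclass

@dataclass
class ActivityRecord:
    student: str
    clicks: int               # page views the platform happens to log
    on_time_submissions: int
    total_submissions: int

def engagement_score(rec: ActivityRecord) -> float:
    """Collapse varied learning activity into a single platform-defined number."""
    submission_rate = (
        rec.on_time_submissions / rec.total_submissions if rec.total_submissions else 0.0
    )
    # The weighting is itself a pedagogical judgement, made by the platform
    # designer rather than the educator: frequent clicking outweighs anything
    # the system cannot log at all.
    return 0.7 * min(rec.clicks / 100, 1.0) + 0.3 * submission_rate

records = [
    ActivityRecord("A", clicks=240, on_time_submissions=3, total_submissions=4),
    ActivityRecord("B", clicks=35, on_time_submissions=4, total_submissions=4),
]

# The dashboard then "ranks" students, but only on what the platform can see.
for rec in sorted(records, key=engagement_score, reverse=True):
    print(rec.student, round(engagement_score(rec), 2))
```

Even in this toy version, the choice of what to count and how to weight it is made once by the designer and then applied silently to every course that runs on the platform.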
As Williamson (2017) observes, the increasing datafication of education allows systems designers and vendors to embed, often subtly, assumptions about how learning occurs and how it should be managed. Educators are then not merely using tools but working within logics that those tools have already settled. The result is a narrowing of pedagogical agency: teaching becomes a process of fitting one’s practice to the contours of the platform, rather than the platform being moulded to support diverse pedagogical goals.
Moreover, once systems are procured, they are often “locked in” through long-term contracts, technical integration, and sunk costs. Changing or challenging them becomes difficult, even when their pedagogical fit is poor. As Macgilchrist (2021) notes, the framing of digital tools as inevitable and universally beneficial obscures their contested nature. By treating platforms and data infrastructures as apolitical solutions, institutions often foreclose the very debates educators should be having - about whether, how, and why such tools should shape teaching and learning at all.
Educators may still find workarounds - repurposing discussion boards for collaborative storytelling, or subverting grading tools to facilitate feedback dialogues - but such resistance often requires additional labour and operates at the margins. The core structures remain governed by logics not of teaching, but of compliance and control.
To contest this, educators must be involved not only in the implementation of tools but in the earliest stages of decision-making. Procurement is pedagogy by other means. Unless educators are part of these conversations, they risk teaching within environments whose values they did not choose.
Data, Governance, and the Reconfiguration of the Learner
The integration of learning analytics and data-driven platforms into higher education has profoundly reconfigured how learners are perceived, assessed, and governed. No longer seen merely as participants in a pedagogical process, students are increasingly rendered as data subjects - quantified entities to be tracked, predicted, and nudged.
This process, often termed datafication, involves transforming educational activity into measurable digital traces that can be analysed and acted upon (van Dijck, 2014; Williamson, 2017). Learning, once a multidimensional and often unpredictable process, becomes flattened into variables: time on task, click rates, assessment scores, submission patterns. While these indicators may offer useful insights, they also risk reducing complex acts of thinking and becoming to behavioural proxies.
One of the critical concerns here is what Knox, Williamson, and Bayne (2019) term machine behaviourism - the idea that student learning and performance can be understood and shaped through behavioural data alone, without recourse to pedagogical relationships or human interpretation. As algorithms become increasingly involved in feedback loops - flagging “at-risk” students, auto-generating nudges, or predicting future attainment - they begin to exercise a form of automated governance over learners’ trajectories. This shifts the locus of educational authority away from the classroom and towards opaque infrastructures, often managed by third-party vendors with limited pedagogical oversight.
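A minimal, hypothetical sketch illustrates the point: the rule below decides who is “at risk”, and what message they receive, using nothing but behavioural traces. The thresholds and wording are invented for illustration and do not describe any actual vendor’s system.

```python
# Machine behaviourism in miniature: an invented flagging rule and nudge
# template, for illustration only.

def flag_at_risk(days_since_login: int, weekly_clicks: int, late_submissions: int) -> bool:
    # "Risk" here is defined purely behaviourally: absence from the log,
    # not anything a teacher knows about the student's circumstances or thinking.
    return days_since_login > 7 or weekly_clicks < 10 or late_submissions >= 2

def generate_nudge(student: str) -> str:
    # The intervention is automated too: a templated message stands in for a conversation.
    return f"Hi {student}, we noticed you haven't been active recently. Log in to catch up!"

profiles = {
    "A": dict(days_since_login=9, weekly_clicks=4, late_submissions=1),
    "B": dict(days_since_login=1, weekly_clicks=52, late_submissions=0),
}

for student, profile in profiles.items():
    if flag_at_risk(**profile):
        print(generate_nudge(student))
```

Nothing in this logic knows why a student has been absent, or whether the automated message helps or harms; the “intervention” is whatever the template says.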
Such systems do not merely reflect learning; they construct it. They define what counts as success, shape the metrics of engagement, and frame students through predefined behavioural patterns. As Slade and Prinsloo (2013) argue, learning analytics must be understood not only as technical systems but as ethical and political interventions that can profoundly influence how learners are perceived and treated. The ethical stakes are considerable. The learner risks becoming an object of prediction and control rather than a co-constructor of knowledge. Data profiles can essentialise students, reducing them to fixed categories that persist across platforms, courses, and institutional contexts - often without their knowledge or meaningful consent.
To navigate this terrain, educators and institutions need to foster personal data literacies - the capacity to understand how digital data is collected, interpreted, and repurposed, and to critically reflect on its implications for autonomy, privacy, and participation (Pangrazio and Selwyn, 2019). Without this critical perspective, there is a danger that the use of analytics will bypass professional judgement and relational pedagogy in favour of managerial oversight and predictive control.
Ultimately, the governance of education through data is not a neutral or technical shift. It is a deeply pedagogical and political act - one that redefines what it means to teach, to learn, and to be a learner in the digital university.
Policy Silos and the Marginalisation of Educators
In many institutions, digital policy is shaped not in classrooms or staff meetings but in boardrooms, procurement offices, and IT departments. This structural separation creates a profound disconnect between those who make decisions about digital infrastructure and those who must live with its pedagogical consequences. While institutional leaders speak the language of innovation and agility, educators are often presented with preconfigured systems, platforms, and protocols - expected to “embed” them into their teaching without having had a meaningful role in their selection or design (Burdon and Harpur, 2014).
This marginalisation of educators in policy-making processes is not merely procedural; it is ideological. It reflects a model of governance in which technical efficiency, managerial oversight, and cost-saving imperatives take precedence over pedagogical reflection or ethical scrutiny. As Williamson (2017) shows, the rise of data-driven governance in education has reframed decision-making in terms of measurement, audit, and algorithmic rationality, sidelining the lived realities and professional judgement of educators. Technologies are not just tools - they embody logics that, once operationalised, become embedded in institutional routines and difficult to contest.
The result is a form of pedagogical complicity. Educators are drawn into implementing policies and technologies that may run counter to their values or professional knowledge, simply because they have little space to resist. As Macgilchrist, Allert and Bruch (2020) warn, when policy decisions are treated as technical inevitabilities rather than contested choices, the educational community loses its capacity to imagine otherwise. The silencing of dissent - or even discomfort - risks cultivating a culture where critical engagement with technology is perceived as obstructionist rather than essential.
Moreover, this disempowerment has a chilling effect on professional agency. When teaching becomes the site of compliance rather than creativity, when choices about pedagogy are driven by the logic of the system rather than the needs of learners, something essential is lost. The challenge is not only to resist bad policy but to build institutional cultures in which educators are recognised as co-creators of digital strategy, not just implementers of it.
Reclaiming Policy Spaces: What Engagement Looks Like
If digital policy is pedagogical, then educators must find ways to engage meaningfully in its formation. This does not mean simply reacting to decisions after the fact or offering superficial “input” on predetermined outcomes. It means recognising digital governance as part of one’s professional terrain - and reclaiming space within it.
Educator engagement in policy processes can take many forms: participating in institutional committees on learning technologies, contributing to consultations about platform procurement, or co-leading strategic initiatives on digital futures. These sites, while often bureaucratic and imperfect, are spaces where pedagogical values can be articulated, defended, and embedded. As Biesta (2010) argues, education is not just about outcomes but about what, and whom, it is for. The same must be asked of technologies: whom do they serve, and what kinds of educational futures do they make possible?
Examples of meaningful engagement already exist. Co-design initiatives, in which educators and students collaboratively shape technology selection or learning design, offer a model of participatory innovation. Peer-led approaches - such as those shown to improve engagement and achievement among community college STEM students - demonstrate the potential of collaborative, learner-centred pedagogies to inform technology adoption and curriculum design (Meador et al., 2024). Critical EdTech review groups - such as those envisioned in speculative accounts of post-pandemic education - remind us that technological futures are not fixed but open to reimagination (Costello et al., 2020). Even procurement processes, often seen as closed or technocratic, can include participatory panels, transparent evaluation criteria, and user-centred pilots that foreground pedagogical and ethical considerations.
Yet such engagement is not easy. Structural barriers persist. Time is perhaps the most significant: educators are already overloaded with teaching, administration, and research. Access is uneven: decisions are often made in senior management circles to which few academic staff belong. Institutional hierarchies can inhibit dissent or critical reflection, especially when digital policy is framed as a managerial or reputational matter rather than an educational one.
Still, these barriers are not insurmountable. One strategy is to build informal networks of concern: spaces where staff can share experiences, raise questions, and coordinate action. Another is to develop critical vocabularies that can be brought into formal arenas - a language of ethics, equity, and pedagogy to counter the dominant discourse of efficiency and innovation. Finally, alliances with students and professional services staff can broaden the coalition for thoughtful, democratic digital governance.
As Selwyn (2022) notes, meaningful engagement with educational technology policy is not about rejecting innovation, but about refusing to cede educational decisions to opaque systems and commercial logics. Reclaiming policy spaces requires courage, collaboration, and the conviction that pedagogy must remain central to the digital university.
Critical Literacy as a Form of Agency
In an age where the boundaries between pedagogy and infrastructure are increasingly blurred, digital policy engagement must be reframed as a pedagogical skill. It is not merely the domain of IT departments or senior administrators; it is a matter of educational agency. Educators need to be able to “read” the systems they are asked to use - not only at the interface level, but at the structural and ideological levels that govern how these systems operate and what they prioritise (Smyrnaios, 2018). This interpretive capacity - what we might call critical policy literacy - is essential to understanding how technologies shape what can be taught, how students can participate, and which forms of learning are rendered legible.
Such literacy begins with asking key questions: What assumptions underlie the design of this platform? Who is collecting data, and for what purpose? What forms of engagement are encouraged or marginalised? Without these critical lenses, educators risk becoming passive implementers of logics they did not choose, reproducing pedagogical conditions that may contradict their values.
Crucially, this is not a call for individual heroism. Critical literacy must be cultivated collectively. Professional development that centres on ethics, governance, and the politics of educational technology can foster shared vocabularies and coalitions of practice. Projects like the Critical Digital Pedagogy network (Stommel, 2014), collaborative initiatives such as Toward a Critical Instructional Design (Quinn et al., 2022), and Freirean-inspired approaches to critical data literacy (Tygel and Kirsch, 2016) offer models for how educators might come together to examine the entanglements of data, design, and institutional decision-making. These collective spaces enable reflection on educational values and provide practical frameworks for resisting platform-determined pedagogy.
This collective understanding extends to leadership and governance. Institutional leaders also need opportunities for critical reflection and development - especially if they are tasked with shaping digital strategy. Without shared ethical and pedagogical grounding, the gap between those who teach and those who decide will only widen.
In this sense, engaging with digital policy is not an optional extra. It is a necessary form of pedagogical agency, a way of reclaiming education as a space of ethical judgement and democratic dialogue - even in the face of increasingly technocratic systems.
Closing Reflection: Policy as a Site of Pedagogical Struggle
Digital policy is never merely technical. Decisions about platforms, analytics, and infrastructure are underpinned by particular logics - often drawn from managerialism, efficiency, and surveillance - that shape what learning is, who gets to define it, and how it is measured. As Suchman (2002) reminds us, technologies do not arrive fully formed but are “always already political,” inscribed with values and assumptions about the world. In the domain of education, this means that every system adopted, every dashboard implemented, and every metric tracked carries with it a view of the learner, the teacher, and the purpose of learning itself.
The central argument of this post bears repeating: digital policy is pedagogy. It is a mode of governance that establishes the boundaries of educational possibility. When educators are treated as passive users of systems designed elsewhere, they are effectively excluded from shaping the very conditions of their practice. Yet these conditions are not immutable. They are contested spaces, shaped through negotiation, resistance, and imagination.
To reclaim these spaces, educators must move from the periphery to the centre of digital policy-making. This means rejecting the role of the “end-user” and embracing that of the co-author - someone who not only implements but helps define the goals, ethics, and principles of educational technology. As Knox, Williamson and Bayne (2019) argue, the increasing adoption of AI and data-driven systems in education risks embedding reductive models of learning based on behavioural prediction. Without critical engagement, these systems may entrench logics of machine behaviourism that sideline relational, interpretive, and human dimensions of pedagogy.
This is no small task. It requires time, support, and institutional cultures that value educational expertise as much as technical proficiency. But it is essential if universities are to resist the erosion of professional judgement and the flattening of pedagogy into platformed procedure. As Selwyn (2022) puts it, reclaiming education in the digital age demands that we “confront the politics of technology, not merely its functions.”
The challenge is not simply to critique what is, but to imagine what could be. What would it mean to build digital systems that centre care, justice, and dialogue? What policies would emerge if educational values - rather than commercial imperatives - led the design process? These are not abstract questions. They are urgent, concrete, and political.
Reclaiming institutional spaces for pedagogical dialogue is thus a form of professional and civic agency. It is a refusal to allow digital systems to dictate the terms of education, and an insistence that those who teach must also shape the tools and policies through which teaching occurs. The struggle over digital policy is, at its heart, a struggle for the soul of education.
Bibliography
- Biesta, G.J.J. (2010). Good Education in an Age of Measurement: Ethics, Politics, Democracy (1st ed.). Routledge. https://doi.org/10.4324/9781315634319 (Accessed: 26 May 2025).
- Brown, J.S. and Duguid, P. (2000). The Social Life of Information. Boston: Harvard Business School Press. Available at https://search.ebscohost.com/login.aspx?direct=true&db=nlebk&AN=52512&site=ehost-live&authtype=sso&custid=s4280246 (Accessed: 26 May 2025).
- Burdon, M. and Harpur, P. (2014). Re-conceptualising privacy and discrimination in an age of talent analytics. University of New South Wales Law Journal, 37(2), pp. 679–712.
- Costello, E., Brown, M., Donlon, E. and Girme, P. (2020). ‘The Pandemic Will Not be on Zoom’: A Retrospective from the Year 2050. Postdigital Science and Education, 2(3), pp. 619–627. https://doi.org/10.1007/s42438-020-00150-3 (Accessed: 26 May 2025).
- Knox, J., Williamson, B. and Bayne, S. (2019). Machine behaviourism: future visions of ‘learnification’ and ‘datafication’ across humans and digital technologies. Learning, Media and Technology, 45(1), pp. 31–45. https://doi.org/10.1080/17439884.2019.1623251 (Accessed: 26 May 2025).
- Macgilchrist, F. (2021). Theories of Postdigital Heterogeneity: Implications for Research on Education and Datafication. Postdigital Science and Education, 3(3), pp. 732–750. https://doi.org/10.1007/s42438-021-00232-w (Accessed: 26 May 2025).
- Macgilchrist, F., Allert, H. and Bruch, A. (2020). Students and society in the 2020s: Three future ‘histories’ of education and technology. Learning, Media and Technology, 45(1), pp. 76–89. https://doi.org/10.1080/17439884.2019.1656235 (Accessed: 26 May 2025).
- Meador, A., Lockwood, P., Subburaj, V. and Subburaj, A. (2024). Examining the Effects of Peer-Led Team Learning as a Support for Community College Transfer Students’ STEM Achievement. Education Sciences, 14(9), 945. https://doi.org/10.3390/educsci14090945 (Accessed: 26 May 2025).
- Pangrazio, L. and Selwyn, N. (2019). ‘Personal data literacies’: A critical literacies approach to enhancing understandings of personal digital data. New Media & Society, 21(2), pp. 419–437. https://doi.org/10.1177/1461444818799523 (Accessed: 26 May 2025).
- Quinn, J., Burtis, M.F. and Jhangiani, S. (eds.) (2022). Toward a Critical Instructional Design. Hybrid Pedagogy Inc.
- Selwyn, N. (2022). Education and Technology: Key Issues and Debates. London: Bloomsbury Academic. Available at http://dx.doi.org/10.5040/9781350145573 (Accessed: 26 May 2025).
- Slade, S. and Prinsloo, P. (2013). Learning Analytics: Ethical Issues and Dilemmas. American Behavioral Scientist, 57(10), pp. 1510–1529. https://doi.org/10.1177/0002764213479366 (Accessed: 26 May 2025).
- Smyrnaios, N. (2018). Internet oligopoly: The corporate takeover of our digital world. Bingley: Emerald Publishing Limited.
- Stommel, J. (2014). Critical Digital Pedagogy: a definition. Hybrid Pedagogy. Available at https://hybridpedagogy.org/critical-digital-pedagogy-definition/ (Accessed: 26 May 2025).
- Suchman, L. (2002). Located accountabilities in technology production. Scandinavian Journal of Information Systems, 14(2), pp. 91–105. Available at https://aisel.aisnet.org/sjis/vol14/iss2/7 (Accessed: 26 May 2025).
- Tygel, A.F. and Kirsch, R. (2016). Contributions of Paulo Freire for a Critical Data Literacy: a Popular Education Approach. The Journal of Community Informatics, 12(3), pp. 108–121. https://doi.org/10.15353/joci.v12i3.3279 (Accessed: 26 May 2025).
- van Dijck, J. (2014). Datafication, dataism and dataveillance: Big Data between scientific paradigm and ideology. Surveillance & Society, 12(2), pp. 197–208. https://doi.org/10.24908/ss.v12i2.4776 (Accessed: 26 May 2025).
- Watters, A. (2021). Teaching Machines: The History of Personalized Learning. Cambridge, MA: MIT Press.
- Williamson, B. (2017). Big Data in Education: The Digital Future of Learning, Policy and Practice. London: SAGE Publications Ltd.
- Williamson, B. and Hogan, A. (2020). Commercialisation and privatisation in/of education in the context of Covid-19. Brussels: Education International. Available at https://eprints.qut.edu.au/216577/ (Accessed: 26 May 2025).