Lancaster University Management School - Scholarship and Innovation in Management Education

The cyborg student

Artificial Intelligence (AI) is expected to transform labour markets (Felten et al., 2023), which are increasingly cyborg in nature (cf. Haraway, 1987; Haraway and Wolfe, 2016). Embracing the rise of AI, job-seeking students focus on what Generative AI can do for them while overlooking what AI might do to them and their career prospects. A cyborg is “a cybernetic organism, a hybrid of machine and organism” (Haraway and Wolfe, 2016: 7). From this purview, modernity is populated by “theorised and fabricated hybrids of machine and organism”. The lives of HE students are no exception. Little (nothing?) in their journey is accomplished without some form of ‘machine’: from applying to university, to interacting with lecturers and classmates, to crafting assignments, students’ practices are inescapably intertwined with the internet, smartphones, apps and other gadgets and tools. The same can be said of educators (note where you are reading this piece). We are cyborgs teaching cyborgs.

This cyborg nature of social relations is not static. As mature technologies become mundane, new technologies develop, raising new challenges in the relationships they mediate (cf. Leszczynski, 2020). The development of new generative-AI applications is exemplary, with much debate flourishing about its implications for sports, management, neuroscience, education, health care and more (Freitas, 2023; Chui et al., 2023). On an optimistic note, Korst and Puntoni (2023) posit that AI is a “capable co-pilot” as it can “[boost] productivity and [free] us for tasks requiring our judgement, emotional intelligence, or other complex decision-making”. Sounds great, doesn’t it? The reality in the classroom is a bit more complex.
Cyborgs teaching cyborgs

I develop my teaching and learning activities aiming to help students reach higher levels of understanding (Biggs, 1996), weaving theoretical reflection into practice-based assignments where students analyse real-case companies, identify challenges and propose paths for action. However, since some students are outsourcing much of this work to AI, they are uncritically bypassing the processes that would lead to learning.

The experiment

I teach business-to-business (B2B) marketing at a stage in the undergraduate degree when most of the cohort is concerned with job markets and is work-skill oriented. To cater for these aims, I adopted an assessment-for-learning approach (Brown et al., 2013) and asked students to craft a (very short) white paper – a key piece of writing in B2B markets – for a market of their choosing. They were to conduct secondary data research, identify challenges for the market in relation to the United Nations Sustainable Development Goals (SDGs), and propose recommendations. Thus, this assignment provides the opportunity for students to develop critical thinking, argumentation, communication, research and decision-making skills that are invaluable for a career in marketing. Determined to ensure students take this opportunity, and aware of the widespread use of AI amongst our cohorts, I asked students to employ AI at the various stages of crafting. Students were also asked to end the paper with a paragraph reflecting on their interaction with AI, and to identify and defend the value of their work in comparison to AI’s.

Lessons and paths forward

Most students were able to write a few sentences on where their unique contributions lie, but this was because of what I found to be the most valuable part of this experiment: the discussions during assignment support sessions.
These sessions allowed me to confront students with fundamental questions about their role in job markets where AI may replace many of the activities currently undertaken by marketing graduates. We as educators play a key role in helping students harness their cyborg nature ethically by discussing AI ‘out in the open’. Upon reading students’ reflections, I drew three lessons.

First, we need to develop activities that push students to think critically and relationally about which knowledge and creative skills produce irreplaceable outputs. Students are outsourcing work to AI while ignoring how this practice supersedes their own learning process, and ultimately what contributions they will be able to offer. Second, degrees must be designed to transform students into lifelong learners so they can navigate turbulent and ever-changing cyborg societies. AI is the latest major technological disruption; it is neither the first nor the last, so students are likely to face such challenges again. Third, students must learn to set ethical boundaries. In the experiment, students were more concerned with Turnitin results, and the impact on their grade, than with authorship and academic integrity (see Lindebaum and Fleming, 2023). This is not to say students are unethical by default. However, educators have a role to play in facilitating learning that prevents students from overlooking the multiple effects of the cyborg nature of societies. While AI may be a “capable co-pilot”, students must learn to become capable pilots (Korst and Puntoni, 2023).

Scholarship Matters

Considering the widespread (mis)use of AI tools in Higher Education, how can educators promote criticality and ethical practice?
