PWR Approach to Artificial Intelligence

LAST UPDATED: July 19, 2023
BY: Stuart A. Selber

Artificial intelligence (AI) platforms, especially generative ones like ChatGPT for words and DALL-E for images, are readily available to college students and are no doubt being used in Program in Writing and Rhetoric (PWR) courses: English 4, 15, 30, 137H, 138H, 202. Teachers need to be able to think judiciously about what generative AI can and can’t do for student writers and about how our community should approach it as a literacy technology.

For our purposes, a fundamental thing to know is that artificial intelligence can generate the types of content we ask students to produce in PWR assignments. It can produce entire assignments, such as argumentative essays, infographics, cover letters, and proposals, and it can produce parts of assignments, such as descriptions and extended definitions. It can also be used in nearly any phase of the writing process, from research and invention to drafting and revising.

Teachers would benefit from learning a bit about how AI works (be sure to watch the video). You don’t need to become a technical expert, but understanding basic mechanics can help you anticipate problems and possibilities for student writers.

A key insight is that AI produces mathematical responses to human communication problems; these responses are based on statistical probability. So the responses are unlikely to fully satisfy the rhetorical dimensions of our assignments, which is good news. Another insight is that AI responses are based on two things: the corpus on which a robot was trained, and the prompts students use to elicit information from a robot. We have no control over a training corpus, but it’s useful to recognize it as a social construction rather than a neutral or comprehensive whole. We have more control over AI input and can help students learn how to effectively prompt and re-prompt robots. It’s a safe bet that we’ll be teaching “prompt engineering” in the not-too-distant future in PWR courses.
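
For teachers who want a concrete picture of what “statistical probability” means here, below is a toy Python sketch of the core move: a language model assigns a probability to each possible next word and then picks one. The words and numbers are invented for illustration; real models work over enormous vocabularies and whole contexts, not five words.

    # Toy illustration only, not how ChatGPT is actually built: a language
    # model assigns a probability to every plausible next word and then
    # samples one. The words and numbers below are invented for the example.
    import random

    # Hypothetical probabilities a model might assign after the phrase
    # "The thesis of this essay is"
    next_word_probabilities = {
        "that": 0.45,
        "clear": 0.20,
        "about": 0.15,
        "unconvincing": 0.05,
        "banana": 0.001,
    }

    def pick_next_word(probabilities):
        # Sample a word in proportion to its probability (weights need not sum to 1).
        words = list(probabilities)
        weights = list(probabilities.values())
        return random.choices(words, weights=weights, k=1)[0]

    print(pick_next_word(next_word_probabilities))  # usually "that," occasionally something less apt

The practical takeaway is the one above: the output is a statistically probable continuation, not a considered response to a particular rhetorical situation.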

This document supports our ongoing conversations in this quickly evolving space. It includes philosophies and practices from the PWR office and links to additional resources.

NOTE: There are seemingly endless resources on AI in education, and the landscape of what’s available is overwhelming and not always on target for our program. So the links here are curated and meant to be limited in size and scope. If you have an excellent resource that fits our agenda, let me know and I’ll add it to this document.

Realities

  • There are hundreds of free AI and AI-assisted platforms available on the internet. Here are thirty-three examples.
  • Penn State enterprise systems already incorporate artificial intelligence, among them Office 365, Google Workspace, and the Adobe Creative Cloud Suite. Within these popular systems, students in PWR courses are heavy users of Microsoft Word, Google Docs, and Adobe InDesign.
  • When students create a new document in the latest version of Google Docs, they immediately see a prompt saying “Help me write.” Clicking on this prompt activates a writing robot in the writing space.
  • Turnitin, our institutional plagiarism checker, includes AI detection capabilities (at this point, Penn State hasn’t activated them), and a future version will include a “draft coach” that plugs into Microsoft Word for the web and Google Docs.
  • Instructure is exploring ways to integrate AI into Canvas, and it’s coming for both students and teachers. We will likely see AI features for course development, which already exist in other learning management systems.

Problems and Possibilities

  • Problems (selected). For writers in our program, AI can generate inaccurate information, make up information (hallucinate), automate work that requires human judgment, neglect current events, and reinforce bias and discrimination. There are many other problems for society in general, such as the ability to use AI to spread political disinformation.
  • Possibilities. This is for us to decide. We need to investigate thoroughly and thoughtfully what can be productive about integrating AI into PWR courses. As a program, we already integrate word processing programs, graphic design programs, search engines, communication programs, and many other literacy technologies. To what ends should we employ AI?

PWR Stance

The PWR assumes a post-critical stance when it comes to literacy technologies, including generative artificial intelligence. What does this mean? People who take a post-critical stance make the following assumptions:

  • Technology adoption and implementation are facts of life in educational settings, for better and worse (this does not mean anything is inevitable, by the way).
  • The educational technologies we adopt and implement reflect the perspectives of their designers.
  • Those perspectives make strong suggestions for use but do not preordain use.
  • A variety of social and institutional forces help to determine the nature and character of educational technology implementations.
  • The same educational technology can come to mean different things to different people in different teaching and learning contexts.
  • A key objective for teachers is to investigate possibilities, identify problems, and promote a classroom in which technology is treated as both an educational subject and a platform for work.

As such, a post-critical stance toward artificial intelligence acknowledges that its capabilities can be a mixed bag for students and teachers and that there will be plenty of nuance to navigate. It also recognizes that the expertise of writing teachers is invaluable to understanding artificial intelligence as a writing platform, positioning us as campus leaders with relevant knowledge to contribute.

Plagiarism

AI promises to exacerbate plagiarism problems unless it’s accommodated thoughtfully. By plagiarism I’m referring to cheating, in which students knowingly hand in the work of a robot as their own, and to instances of so-called “inadvertent plagiarism,” in which students do not really know how to incorporate the work of a robot into their writing. Teachers themselves are struggling with how to think about AI-generated text as source material for writers. This will be an important topic of conversation in our program.

One complicating factor has to do with copyright. In ChatGPT, for instance, users own the copyright to both the input (their prompts) and the output (robot responses). This makes sense in that AI companies want users to feel free to use the content produced by robots. Students may very well own the copyright to what robots produce for them.

Keep in mind that this is a legal distinction, not a policy (or even ethical) distinction. Just because students own the copyright to a text doesn’t mean they can use it in just any situation. A historical example: For many years now, we’ve not permitted students to reuse work (“self-plagiarism”) to earn credit more than once without permission from the current teacher. There are limits to what copyright ownership buys students in academic settings.

Obviously, plagiarism is an old problem in writing programs, but there does seem to be something qualitatively different when AI is involved. Proving academic misconduct usually involves finding the plagiarized texts, and to find them, teachers often use the very technology students used to cheat in the first place: Google or another search engine. With AI, however, there’s no plagiarized text to find. Although some studies have concluded that robots themselves can plagiarize, the issue here has to do with what counts as proof for the College of Liberal Arts Academic Integrity Committee.

At this point, anyway, it’s important to realize that plagiarism detection software is unreliable and unlikely to provide the evidence you need to prove academic dishonesty. Time will tell whether its accuracy improves.

So what to do? Ironically, perhaps, your best approach to new forms of AI is to use our established methods for plagiarism prevention and detection. Writing effective assignments is good pedagogy and a fine preventive against plagiarism. Beyond that, Andrew Peck, Senior Director of Academic Integrity for the College of Liberal Arts, is encouraging us to have thoughtful conversations with students whose work raises suspicion. If those conversations yield actionable information, he wants teachers to initiate the academic integrity process. The College provides advice on how to talk to students about suspected work.

More generally, continue to pay careful attention to end-to-end processes for student writing. It’s difficult for students to cheat if assignments are framed rhetorically and their teachers are working as involved coaches. This has always been and continues to be the case, no matter the literacy technology.

Of course, don’t hesitate to ask the PWR office for help in navigating suspicions or actual misconduct cases. We’re here to assist you with the complexities of AI.

Starting Points

  • Make an account at OpenAI and try out ChatGPT. You need to have a sense of how AI works in order to think about problems and possibilities. Ask the robot about things you know a good deal about. This will make it easier to evaluate the responses. Here are some ideas for more advanced prompts.
  • To your syllabi, add a Penn State statement on AI and plagiarism. If you are in a 602 for English 15 or 202, the standard syllabus you’re using should already include a statement.
  • Read this brief article on AI literacy. Although written for a K-12 setting, it sets the stage well for program conversations.

Pedagogical Practices

Our agenda for the 2023-24 academic year will involve developing concrete pedagogical practices for both incorporating and interrogating AI as a writing platform. This may very well include modifying some of our current standard assignments and adding new assignment options.

In the meantime:

  • Revisit the workflow for your assignments. What are the steps students need to take to get from start to finish? The goal is to have a well-defined workflow with concrete deliverables for each step. The steps should probably include approving topics (i.e., require students to get your permission to take on a given writing task in response to an assignment), reviewing invention or planning work (perhaps require students to turn in working documents along with “final” assignments), conducting peer reviews, tracking draft development, and reviewing revision plans. Involved teachers might not read every word in every step of an assignment, but they’re keenly aware of what students are doing, how they’re doing it, and why they’re doing it. They’re guiding the entire assignment workflow as something of a project manager.
  • Add a metacommentary to each assignment. As part of a final submission, ask students to write and append a reflection that explains their idea development and rhetorical decision-making in response to the proposal you have approved. This could well include how they used AI (and other literacy technologies) as a writing aid—for better and worse. We will work on making adjustments to the PWR curriculum to account for the additional labor of adding a metacommentary to each assignment. We will also develop instructions for writing metacommentary that can be added to assignment sheets.
  • Archive the deliverables for each step in an assignment in Canvas. Create dropboxes for lists of topics, invention or planning work, drafts for peer reviews, peer-review responses, revision plans, metacommentaries, and the rest. This could also include the input (prompts) and output (robot responses) from student sessions with AI robots.

Penn State Resources

Our institutional website on AI includes a useful section on Frequently Asked Questions. Not everything is relevant to the PWR program, but taking time to understand the bigger picture can only help you navigate this tricky terrain as a Penn State teacher.

External Resources

 
