WN #18

How Not to Use AI

I recently came across an article from The Atlantic that struck me with its headline: “The AI Industry Is Stuck on One Very Specific Way to Use a Chatbot.” Intrigued, I opened it to find that it was referring to a suggestion that appears everywhere on AI platforms, and one I have noticed myself: use generative AI to plan a trip. The point of the article seems to be that, even though trip planning is cited as a common use case, AI isn’t actually that useful for it yet. That may be true in some senses, but the methods the author uses to arrive at that conclusion have shortcomings of their own. In critiquing a specific way the AI industry wants you to use a chatbot, the author inadvertently highlights a specific way those new to AI tend to use chatbots, which hinders their effectiveness for planning that trip, or anything else.

If you expect AI to give you the perfect response on the first try, you are going to be disappointed (image created with Ideogram.ai)

The author begins by asking AI (in this case, Microsoft’s Copilot), to plan a “perfect day” in Los Angeles after telling it she had “one day in town to explore the sights.” She mentions that she lives in LA, so when the chatbot produces a completely unrealistic itinerary, whisking her around from place to place with seemingly little understanding of the time it actually takes to get between those places in a sprawling, traffic-congested city, she has no problem recognizing its flaws. This is actually a great strategy for testing out a chatbot’s capabilities.

Tip #1: When trying AI tools, test them on things you are already very familiar with. This makes it much easier to get a sense of their strengths, limitations, and biases.

So far, so good. The problem comes when the author simply dismisses the response as unhelpful and uses it to make her point that, despite AI platforms pushing the travel-planning use case, their chatbots actually aren’t very good at it. This is a key flaw in how inexperienced users often interact with AI chatbots – treating the first output as the final word and drawing conclusions about the tool’s capabilities from it.

Tip #2: Treat AI as you would a student. Sure, it would be nice if it nailed everything perfectly on the first try. But AI isn’t human, and doesn’t always intuitively understand things the way a human would, just as a student may not always understand things the way you do. If a student doesn’t seem to get it at first, do you give up and say they’ll never get it, or do you rephrase, elaborate, and provide scaffolds to help them get there? It’s important to understand that using AI is an iterative process. If it struggles with the initial prompt, that doesn’t mean it can’t be helpful; you may just need to adjust the prompt or provide additional context to get what you are looking for. And the more you work with AI, the better you understand what it needs in order to work more efficiently and effectively. Sound familiar?

So what could the author have done differently here to get a more helpful output? Again, put on the teacher glasses and see what it seems to be struggling with. In this case, it’s clearly struggling with the time factor. It doesn’t seem to understand LA traffic, nor does it seem to understand that the author may actually want to spend a little bit of time at each place rather than simply traveling from point to point on a map to check them off a list. To address this, you can either edit the initial prompt, or write a follow-up prompt:

“Be sure to consider traffic and driving distances.”

“Limit activities to no more than 3 per day.”

“Allow for 2-3 hours at each location.”

“Minimize the amount of time spent traveling from place to place.”

These simple follow-ups, or edits to the initial prompt, take just seconds to add, but can vastly improve the output.

Tip #3: Human + AI is better than just AI. The author has context that a machine doesn’t. As a living, breathing human being, she understands the logistical hurdles of getting around large cities. She has also traveled before and has an idea of what a “perfect day” of sightseeing looks like based on her preferences and past experiences. But the chatbot was lacking that context on the first go-round. That’s ok. See its needs, adjust.

There are other options here too, besides just adding instructions or context in search of a better version of the original output. For instance, you can take what AI did give you, and work with that:

“Center the day around the Santa Monica pier. What else is there to do in that specific area?”

“What are some things to consider if I decide to visit Hollywood/Beverly Hills?”

“Which of the places you mention give the best opportunities for awesome photos?”

Tip #4: Dig in. One of the best parts about using generative AI is that you can expand on any part of the output. You might treat the initial output as no more than a menu to figure out where you want to take the conversation. Maybe the overall plan isn’t helpful, but focusing in and asking questions about a specific part of the output might be.

Having deemed Copilot’s attempt to create an itinerary a failure, the author jumps over to a different AI chatbot, ChatGPT, to see if it can be more helpful. Her prompt to ChatGPT adds a little more context, mentioning that she’s a “huge foodie” and asking for an itinerary that accounts for this. She is once again disappointed when the chatbot tells her she might try a Michelin-starred restaurant for dinner, without saying which one. Just another example, she points out, of AI falling short of being helpful.

In this case, she is likely using the free version of ChatGPT, which doesn’t have access to the live internet and relies on training data that cuts off in early 2022. It’s entirely possible that the vague suggestion is a function of ChatGPT trying to avoid giving outdated information. A simple follow-up prompt asking for specific Michelin-starred restaurants based on its most up-to-date information would have gotten her what she needed. She also could have just stuck with Copilot (the chatbot she used for her first prompt), which uses the same (or a better) GPT model but is connected to the live internet via Bing search. Or she could have used Perplexity, another free AI chatbot, which lets you specify the sources it uses to craft its responses, including foodie favorites Yelp and Reddit. This would have been my go-to for a question about Michelin-starred restaurants.

Tip #5: Understand the strengths, limitations, and features of the tools you are using, and learn about a diverse array of tools so you can turn to the best one for your specific request. No tool is perfect, but where one falters another may shine.

Of course, at any point you can switch back to traditional research and planning as well. Maybe ChatGPT gives you the idea to have brunch at a beachside cafe and dinner at a Michelin-starred restaurant, at which point you jump back into Google or use whatever method you normally would when trying to find a place to eat.

Tip #6: Understand the point at which it may be better to take over the reins yourself. When I first started using AI, I would sometimes spend large amounts of time trying to iterate and get AI to perfect its output, before realizing that the initial response got me 80% of the way, and it would have been way quicker just to tackle the remaining 20% myself.

When we use AI with the expectation of perfection, it’s easy to be disappointed. But when we view it as a collaborative tool – a partner in the process – there is much more value to unlock. The industry may be fixated on getting you to use a chatbot to help plan your next trip, but it’s not going to be very helpful until we get over our fixation with AI being an all-knowing answer machine. “I can’t emphasize this enough,” says an employee of an AI company interviewed for the article, “These kinds of tools are meant to supplement, not supplant, our decision-making process.”

ACT Exam-Takers Provide Insight on Student AI Use

Students who took the ACT exam last June were surveyed about their AI use, the results of which have been compiled into a recently released report. I popped the 38-page report into the free preview of Google’s new Gemini 1.5 AI model, notable for its ability to digest huge amounts of text (like a 38-page report). Given that the survey can correlate student responses with their scores on the ACT, I asked Gemini to pull out key insights specifically related to student performance. Here are some key findings (straight from AI, but verified with my own two eyes):

  • Higher-scoring students are more likely to use AI tools.

  • Among the reasons for not using AI tools, lower-scoring students are more likely to cite lack of access and knowledge as barriers, whereas higher-scoring students are more likely to report lack of interest.

  • Lower-scoring students have higher expectations for AI tools' impact on persistence and critical thinking.

  • Higher-scoring students are more likely to recognize limitations of AI tools.

  • Lower-scoring students are more likely to consider using AI tools for college admissions essays, though the overall number of students considering this use case was very low.

Of course, last June is forever ago in AI terms, but the overall insights from the survey are still valuable for educators as they try to grasp how AI tools are being used and perceived by their students.

AI for School Leaders

The example above demonstrates a key AI use case for school leaders: reviewing and analyzing large documents. It’s #6 on this list of 13 Things School Administrators Can Try with AI. Of course, school leaders will not only want to familiarize themselves with some of these tools and use cases; they are also tasked with promoting the productive use of AI campuswide. For that, Dr. Catlin Tucker offers 3 key tips:

  • Think about how AI can support existing school initiatives

  • Invest in professional learning that leverages AI in service of strong pedagogical practices

  • Encourage a culture of experimentation and iteration

Students Using AI in Groundbreaking Ways

While many educators are still grappling with what to make of AI, students are already using the technology in incredible ways. From AI tutoring tools and postpartum depression chatbots to autonomous robot bikes and Wizard Chess, here are the cool things students (as young as 15!) are building with AI.

📌 Claude 3 is Here!

Anthropic’s Claude has long been the most underrated AI chatbot, in my opinion. If you haven’t used it, now is a great time to check it out with the release of the newest version, Claude 3. The free version of Claude 3 far outperforms the free version of ChatGPT. It’s also lightning quick and includes image recognition and file upload features. I have yet to use the paid version, Claude 3 Opus, but there are many reports that it surpasses the industry-leading GPT-4 on several benchmark tests.

📌 MagicSchool Introduces MagicStudent

Educator-centric AI platform MagicSchool has released MagicStudent, which gives students access to AI tools designed specifically for them, such as a writing feedback generator, a college & career counselor, and a language tutor, all with built-in safety guardrails. The student tools are controlled from the teacher dashboard, letting teachers choose which tools students have access to and when, as well as view how students are using them.

📌 Acrobat Adds a New Trick

Adobe Acrobat has added a new AI assistant for PDF files that can give you summaries, analysis, insights, and more. Subscribers can now access the tool for free on the desktop and web versions of Acrobat (just look for the “AI Assistant” button). This will eventually be a paid add-on subscription, so try it out while you can. This feature adds to a growing number of ways to access AI for PDFs (see “The Lab” section below).

5 FREE ways to use AI with PDF files

Copilot - Open a PDF in Microsoft Edge browser (the default browser for many Microsoft devices) and click on the Copilot logo in the top right of the browser window.

Gemini - Go to Settings → Extensions → Turn on “Google Workspace”. This allows Gemini to read files (including PDFs) that you provide from your Google Drive. Just type the “@” symbol, select “Google Drive,” and then paste the file name or link (make sure the file isn’t “restricted” in the sharing settings). Last I checked, this only worked with personal Google accounts, not school ones … yet. Alternatively, you can try a free preview of Gemini 1.5 Pro, which simply has a button to attach a file (this is my go-to at the moment!).

Claude - Attach a file using the paperclip icon. You can upload up to 5 files at a time with a maximum of 10MB each.

Perplexity - Click the “Attach” button to upload up to 3 files per day.

Adobe Acrobat - See the “Bulletin Board” section above. If you don’t have a subscription to Adobe Acrobat, you can still try the “AI Assistant” tool for free by signing up for a 14-day free trial, which includes free access to the AI tool.

Note: These work best with actual PDF documents, not images of documents that are saved as PDFs (like a scan, for instance). You’ll need to treat those as images when it comes to AI, not PDFs (some tools are still able to pull out the text from an image, a future tutorial perhaps).

More Ideas…

There are lots of great ways to incorporate AI art generation* into the classroom. Here are a few ideas (ELA-centric but may spawn some ideas in other subjects as well!):

* If you are looking for a free AI art generator, there are lots. Here’s a handful: ImageFX, Designer, Imagine, Ideogram

That’s all for this week! If you appreciate the content in this newsletter, consider subscribing for free, or sharing with people in your network who may find value in it. If you are looking for more, feel free to check out the archive to catch up on any editions you may have missed.