Do I need to learn how to use AI?
May 24, 2024 9:20 AM

So far, I haven't bothered trying ChatGPT or its equivalents at all. I assume that in the time it takes for me to figure out how to tell it what to do (compose an email, write snippets of code etc), I could've done the task at hand myself. Am I wrong? Am I missing out? Do I need to get on board and catch up, or risk being left behind?
posted by wutangclan to Technology (31 answers total) 10 users marked this as a favorite
 
You need to learn what the technology can and can't do, where it is reliable and where it presents complete fiction. Not necessarily because you'll be using it but because a *lot* of the content you will be seeing will be (is) produced by AI and you need to make informed judgements about it.
posted by Tell Me No Lies at 9:25 AM on May 24 [14 favorites]


It's a huge hype cycle right now, and by the time you need to learn how to use it, it will have already been baked into all the tools you are used to using. Now whether that is a good idea or not is another question. Some really iffy uses are being pushed out there and, as Tell Me No Lies implies, maybe it's time to learn to identify when it's being used and to treat it with extreme skepticism.
posted by advicepig at 9:27 AM on May 24 [3 favorites]


Discernment is perhaps more useful than AI itself, as said above. A lot of people seem to just accept what AI says without doing a modicum of fact checking, which is not very useful.

I enjoy it for help talking through problems on occasion and for specialized functionality like image recognition, but it's not 100% reliable and I always have to keep that in mind.
posted by Alensin at 9:29 AM on May 24 [4 favorites]


There are a lot of ways to answer this. Here is a self-test:

If in 1996, someone had asked you about this new-fangled Internet and wondered whether they needed to learn it (reasoning that they already had all the information they needed, and if they ever wanted more, they could go to the library), what would you tell them based on what you know now?

If in 2010, someone asked whether they needed to learn to use a smart phone, what would you tell them based on what you know now?
posted by RoadScholar at 9:35 AM on May 24 [7 favorites]


Do you need to generate "content"? I.e. trite writing whose accuracy is unimportant?

If the answer is no, then you don't need ChatGPT or its ilk.

Also, corporations keep pushing AI on us, so the FOMO is unnecessary. One has to actively decline the offers of Gemini, Bing/Copilot, etc. to escape AI; it's harder to escape it than to use it.
posted by splitpeasoup at 9:36 AM on May 24 [13 favorites]


I use ChatGPT for rough drafts of business correspondence and it's fine for that as long as I add my own finishing touches and correct for any weirdness in style or facts. There was no learning curve. Just ask it as if you were asking an entry-level, first-month-on-the-job assistant to write a draft.

HOWEVER, I do not put any information about my organization there. No dues rates, no real product names, nothing that can be used to train it on info about my organization. It can go crawl our website for that.
posted by kimberussell at 9:41 AM on May 24 [3 favorites]


I find it gets me 50% through some annoying tasks. And usually it's the most tedious 50%. And it doesn't take much time to figure out how to get good enough results from models like ChatGPT or Claude.
posted by mullacc at 9:44 AM on May 24 [1 favorite]


If you do end up using AI for any text, keep in mind that it will rewrite based on criticism too. You can respond to what it gives with comments like, "Fine, but make it 30% shorter" or "More professional in tone, please" and it will do that. It's not going to make it sound less generic, but it can help you shape the dreck it gives you into dreck that works for its purpose.

As an autistic person who sometimes comes off as overly weird/idiosyncratic in a staid business environment, it is nice to be able to quickly generate email/correspondence that does not scream Weird Non-Neurotypical Person At Work.
posted by DirtyOldTown at 9:55 AM on May 24 [6 favorites]


I think the answer is really "it depends." It's 100% clear that AI will not help me do my job and in fact will only make my job harder/worse. I don't know any scrupulous person in my line of work who uses AI.

Will I eventually use it to...I dunno, write an email for me? Come up with a recipe? Plan a vacation? Surely not now, when it's so blindingly clear that much of what AI creates is rubbish. But maybe in the future?

To use RoadScholar's scale, I'd say that I learned to use the internet when it benefited me to do so -- personally and professionally. I switched over from a dumb phone to a smart phone when it benefited me to do so. I don't see why AI has to be any different.
posted by BlahLaLa at 9:59 AM on May 24 [15 favorites]


My only use of ChatGPT so far has been to gin up some PowerShell scripts, and for those purposes it has been much quicker than trying to write up the scripts myself, since I am not terribly adept at the task. The ability to just input a natural-language request makes "learning to use" the AI pretty painless, and since I can find out whether the script works just by running it and seeing if I get what I want, I don't have to worry about factual accuracy, bland content, or any of those things.

One of the core promises of AI is that you just tell it what you want it to do. The tools I have seen already come very close to that ideal, so in short order you probably won't have to learn anything at all.
posted by briank at 10:26 AM on May 24


The bad news is that it depends, so we can't tell you. The good news is that it's virtually costless to try. So the next time you want to write something and it would be time consuming, or you're finding it difficult to figure something out using your usual methods, give it a shot for 30-60 seconds and see if it helps. Repeat 3 or 4 times for different tasks. If you don't find it useful or enjoyable, don't do it again.
posted by Mr.Know-it-some at 10:49 AM on May 24 [2 favorites]


I am a lawyer and have played with it to draft some (very, very) basic documents that are non-standard for my work, but nothing technical or regulatory, because I know I can't trust it for that. It drafted a letter of intent for a specific kind of non-binding research product, the bones of a website hosting agreement, a short informal bill of sale for an intercompany asset transfer I needed to document, and a friendly liability assumption document that was less about legal teeth and more about setting expectations for behavior.

In each case I needed to add, delete, and edit the output to get to something usable. But in each case a good chunk of what it generated stayed in the final document. If someone without proper legal experience had generated and then presented the document as final, each would have been pretty obviously deficient.

I would agree with mullacc that it is a good way to kickstart a document that will be a bit tedious to draft.
posted by AgentRocket at 11:17 AM on May 24 [3 favorites]


I assume that in the time it takes for me to figure out how to tell it what to do (compose an email, write snippets of code etc), I could've done the task at hand myself. Am I wrong?

I wouldn't generally use an LLM to compose an email, unless I needed the assistive power of a machine that produces readable sentences. Some people struggle with general purpose writing like that, so it can be a powerful accessibility tool. Otherwise, it's not great for the task of writing the ideal email length of 2 or 3 sentences.

Snippets of code can come out of LLMs in usable form, sometimes, depending on the language and the LLM. OpenAI's is kind of notoriously bad for some languages; I've received a lot of totally useless code that ignores major portions of carefully crafted prompts. Short snippets for small but thorny things can sometimes have better success.

The prompt-to-result time ratio depends heavily on how efficient you are at a given task yourself. Prompting a commercial LLM-as-a-service is done in natural language: "Write a C program that accepts a filename argument and prints the contents of that file in reverse." or "Summarize the below text for me: [PASTE A BUNCHA STUFF]"
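
To make that first prompt concrete, here's roughly the shape of thing you'd hope to get back; treat it as an illustrative sketch of typical LLM output, not a transcript of an actual session:

    /* Print the contents of a file backward, byte by byte. */
    #include <stdio.h>

    int main(int argc, char *argv[]) {
        if (argc != 2) {
            fprintf(stderr, "usage: %s <filename>\n", argv[0]);
            return 1;
        }
        FILE *f = fopen(argv[1], "rb");
        if (!f) {
            perror("fopen");
            return 1;
        }
        /* Find the file length, then walk backward one byte at a time. */
        fseek(f, 0, SEEK_END);
        for (long i = ftell(f) - 1; i >= 0; i--) {
            fseek(f, i, SEEK_SET);
            putchar(fgetc(f));
        }
        fclose(f);
        return 0;
    }

The point is that checking output like that against the prompt (run it on a test file, see whether it comes out reversed) takes seconds, which is what makes code such a friendly use case.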

Am I missing out?

I mean, these things can be powerful tools, but it depends heavily on what you need the tool to do. It's important to remember that an LLM, ultimately, is a machine that produces words from your prompt. It's not a machine that produces information. The commercial services want you to forget the distinction, but there is a distinction.

Personally, I think most of these LLMaaS things are kinda dreadful at producing words. Their writing styles, even with tons of prompting to modify, are awful. There's an underlying stink you can't get rid of; you need fine-tuned local models to do it. ChatGPT and its ilk, though, have the advantage of being readily available and easy to access.

Do I need to get on board and catch up, or risk being left behind?

I mean, there's nothing to "catch up" to. If you can ask a person to do something, you can ask a natural language LLM: "Write an email to my boss asking for clarification about why my WFH request was declined." I don't think anyone's leaving you behind unless you're struggling to write a sentence or two of instructions.
posted by majick at 11:28 AM on May 24 [3 favorites]


ChatGPT can be used to summarize long texts that you can’t be bothered to read. AI is also good for generating images to use in business presentations and the like. And it’s somewhat useful to help you solve your crossword puzzle.
posted by shock muppet at 12:28 PM on May 24


I am also a deep AI skeptic, but the one thing I enjoy using ChatGPT and Gemini for is figuring out how to write spreadsheet functions for things that I am 99% sure I can do in a spreadsheet, but can't quite figure out how to do on my own. You can even say what kind of data you have in which columns and it will spit out a formula for you that does exactly what you were hoping for, which you can then copy and paste into your spreadsheet. Pretty neat!
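
For example (a made-up exchange): tell it "column A has dates and column B has amounts; give me a formula that totals B for the current month" and it will come back with something like =SUMIFS(B:B, A:A, ">="&DATE(YEAR(TODAY()),MONTH(TODAY()),1), A:A, "<="&EOMONTH(TODAY(),0)), ready to paste in, and easy to sanity-check against a few rows you can add up yourself.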
posted by mostly vowels at 12:55 PM on May 24 [5 favorites]


RoadScholar, the self-test should also include technologies that were heavily promoted but turned out not to meet expectations.

AI has been around in some form for almost 70 years now, so it provides lots of examples:

If in 1985, someone asked whether they needed to learn to use rule-based expert systems, what would you tell them based on what you know now?

If in 1959, someone asked whether they needed to learn to use neural nets ....

etc.
posted by JonJacky at 1:06 PM on May 24 [2 favorites]


You really, really, really should only use it in cases where you know enough to be able to validate the answers. Do not use it for any factual questions that you don't already know the answer to; it will be confidently wrong and you won't know. Do not use it to find resources; it will make up totally fictitious titles, URLs, and other citations.

Places where I found it helpful:
- absolutely fantastic at writing a memorial poem to honor a relative. It provided about 70% of the final product, but I would never have gotten there on my own
- writing website copy where I either give it the details or am asking about things where I know enough to check the answers. This still takes 2-3 iterations, but the result is better than doing it 100% on my own. Note that I still edit so it has the right tone, but it gets me close. I also used it for a marketing email that I was struggling with where to even start.
- figuring out how to do things in Excel when I don't even know the name of the function, but only because I can insert the answer in the spreadsheet and confirm that it works

Note that none of those are essential to my life but it has been interesting to experiment.

Things that were a disaster:
- I asked it who was in a certain painting of a group of famous people. It gave me an incomplete answer. I asked if a specific person was in the painting and it said absolutely not. It was just wrong: that person was in the painting, as a friend was able to demonstrate, to my chagrin.
- I asked about the health benefits of magnesium and the results were unreliable, with minor mistakes. Definitely better to just google it.
- I asked for help with ideas for a paper I was writing for an informal presentation. It was vaguely helpful. I asked for a more detailed writeup (about 2 pages) on one of the themes it proposed, and it included things that were just wrong (like mentioning animals that live in the desert that don't, even though I hadn't even asked it to talk about animals). I asked it for resources on the topic and it gave me a dozen; not a single one was actually a bona fide link to something on the topic.
posted by metahawk at 1:13 PM on May 24 [3 favorites]


I use it a fair bit. I almost never copy any text into any final work product. I don't like the writing, and most often the purpose of writing for me is to record the outcome of some thought process I went through that AI does not replace.

If you're good at something, it will be worse than you.

In my view, the best current general use cases are ideation and brainstorming.

As a rule of thumb, I ask myself "am I using this to think better, or to think less". The former use case is naturally a good one. The latter requires having confidence in the output and a full picture of the risks of errors and is generally not going to be worth your time.
posted by lookoutbelow at 1:29 PM on May 24 [7 favorites]


I’d like to echo lookoutbelow - “if you’re good at something, it will be worse than you.”

So, identify things that you personally find challenging, and use those as potential exploration points for ChatGPT. I don't use it in my day-to-day, but I now understand ChatGPT and ways I can use it, and I probably use it once or twice a week now after starting from zero a few months ago. No regrets spending a few hours tinkering to gain an understanding and familiarity.
posted by samthemander at 1:48 PM on May 24


I use it occasionally at work to produce copy for slide decks, usually for things I understand and don't want to explain again, so I put in a few bullets and say "make this into a paragraph with a professional tone." I still have to fix it because it sounds weird, but I can either spend 30 minutes to an hour writing a paragraph through writer's block, or I can have the robot do something that's good enough for government work.
posted by fiercekitten at 4:35 PM on May 24


I think if you aren’t interested in using it or trying it out you aren’t missing out. And as it improves it is less and less likely you’ll have catching up to do - it seems like it gets increasingly easy to use.

My current favorite uses are all the PowerShell scripts that are useful to me but that I do not want to personally write, and recipe math.

Over the weekend I took a picture of a recipe from a cookbook, sent it to ChatGPT 4o, and asked for it to be tripled. I'm presuming the math was correct. The salad dressing turned out fab.

I also have had it cut recipes down as needed and sometimes convert to weights from measures - at this point I trust it implicitly for recipe math.
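
(For what it's worth, that kind of math is quick to spot-check: tripled, 3/4 cup should come out to 2 1/4 cups, and if a sample conversion like that checks out, the rest of the scaled recipe probably does too.)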
posted by hilaryjade at 5:53 PM on May 24


How old are you? Or, put another way: how close are you to retirement? I work in IT and have so far avoided any intensive dealings with AI. I'm probably less than 10 years away from retirement, and all I see when I look at AI is party tricks, for the most part, at least at my work. By the time it does anything useful, I will be checked out, so I have made the decision not to give a damn about it, in both my work life and my personal life.

And I don't feel like I'm missing anything by not diving into it in my personal life either. Don't misunderstand; I'm no Luddite, I have plenty of tech in my life. I just...don't feel like I need more? For instance, Gemini installed itself on my Pixel 8 Pro a while back. I used it for like three days, and it annoyed me so much that I removed it; the old-school Assistant is just fine for what I need it for.

YMMV, of course, but if AI is mostly "compose an email, write snippets of code", or things in that vein, well, I've been doing that for literally decades, and I don't really care if something can do it more efficiently. I'm efficient enough at writing an email.

I guess I'm just at my limit for Things I Feel Like I Need To Be Conversant With. The AI train has left the station, and I'm standing on the platform watching it go, feeling perfectly content. Whether that could be you or not, I have no idea, but I will say that I don't feel like I'm missing anything.
posted by pdb at 6:08 PM on May 24 [1 favorite]


a parable: sausage-casing jeans came into fashion when I was out of high school and too confident in my own judgment to be taken in. eventually they were consigned to the dustbin of history by the knowing ones, and I only had to wait twenty years for it to happen, which is nothing.

idiotic and unsatisfying fads in technology are no different. if you used to use a fax machine or a hydrogen-powered airship or a player piano, you may have some nostalgia for the specific shape and feel of those things. but if you never did, you don't kick yourself now for missing out on a marvelous window of opportunity. indeed, you are probably considerably more alive and thriving than many who opted in.

using “ai” may be or become a requirement in assorted vile jobs for some unknown but finite length of time. it will never be worthwhile. if you care to wait with resolve, it will go away again. perhaps it will be superseded by something uglier and stupider along the same lines, but nevertheless it will pass out of its current form.

another parable: once I had a job where a consultant came in and talked about “lean six sigma” principles for a while. nobody knew nor cared what that was, and eventually he left. you can outlive so many stupid things. have some faith
posted by queenofbithynia at 7:23 PM on May 24 [6 favorites]


I assume that in the time it takes for me to figure out how to tell it what to do (compose an email, write snippets of code etc), I could've done the task at hand myself.

I used ChatGPT for the first time to help me create my self-evaluation for work. Self-evals at my firm are absolute bullshit that they don't pay any attention to, but they still expect me to "take them seriously."

The evaluation writing style at work is some weird corporatese thing that I only ever use for self-evals, once a year. I have never gotten fluent in writing it. So it's always been a bit of a struggle.

I typed up a list of my accomplishments, goals, and areas of improvement in plain English, chucked them into ChatGPT with the instruction to create a self-eval, and had a long first draft in the style within seconds. I had to edit for accuracy (some rephrases made my responsibilities sound more encompassing than they actually were), but the style was perfect. It was far faster than laboriously trying to replicate the stilted style that screams "taking the self-eval seriously."

I got an "exceeds expectations" on my review for that year, so either no one cared about my self-eval, or no one noticed or cared that it had AI language in it. If there are things that you find tedious and time-consuming to write for stylistic reasons, AI can be faster than doing it yourself.
posted by creepygirl at 10:36 PM on May 24 [3 favorites]


You know how internet search has gone downhill tremendously?

I use ChatGPT to sort through the garbage for me way faster and provide actual individually tailored answers to things like how to approach a coding problem, or “I have a flight out of this airport at 11am Tuesday and I’m staying at this hotel, what are my various options for getting there on public transit, how complicated are they, and when would I need to leave if I chose each one?” It’s not perfect but it’s faster and better than I could do jet lagged.
posted by deludingmyself at 11:02 PM on May 24 [1 favorite]


Simplify. I use ChatGPT to simplify things because I am terrible at it. I'm in accounts receivable and work with salespeople whose reading comprehension can be... questionable. So when I'm trying to explain an accounting process or concept to my sales team, I use ChatGPT. "Explain Accounting Concept/Process X to a salesperson" is incredibly useful. Sometimes I even tell it to simplify its response.

This only works because I know what I'm talking about and can spot where the AI is wrong, and fix it. Does my sales team always understand? Well, I can also ask ChatGPT to explain things to a 6th grader so....
posted by MuChao at 5:18 AM on May 25 [1 favorite]


I've used it a few times for work. Once I needed a write-up on why social studies is important in the early childhood classroom. I could have written it. I told ChatGPT what I needed, proofread the result, and was done in 5 minutes. I also used it to summarize why cooking with preschoolers is beneficial, and a couple other things in that vein. It was basically busy work, and I know the material well enough that I was confident I could pick up on any nonsense (there was none, just some repetition).

In my daily life, not so much.
posted by kathrynm at 8:43 AM on May 25


This question stoked my curiosity, so I asked my own question about what people actually use LLMs for. You may find the answers helpful in deciding whether you want to bother with them.
posted by Tell Me No Lies at 1:32 PM on May 26


Overall, I currently like perplexity.ai better than Google for general web search. It's not always correct, but it almost always lists citations/site hits so you can double-check it. Plus, you don't need to make an account, it's almost always pretty fast, and there are no ads.
posted by bitterkitten at 7:31 PM on May 26


Co-Intelligence mentions a few ways to try it out, which echo some of the good suggestions above:
1. Identify "centaur tasks" and "cyborg tasks"; namely, stuff where you can let the robots do part of the work and then pick it up yourself, vs. stuff where you need to work closely with your robot partner to get satisfactory results.
2. Just try it and see what (if any) of the drudgery it can take off of your plate. Keep in mind this is probably the worst AI you'll ever use, so don't be shy about trying again in a couple months.
3. To get the best results, you should offer context and constraints in your prompts; there are some colorful examples where the author mentions having 3 distinct AI "personae" review the work for different things (technical detail or layman readability, etc.). But you can always say "You are a [role], trying to write a prompt to get [desired result]. What specific elements would you highlight in the prompt? Can you give me ten sample prompts that a [role] could use to get [desired result]?"

As folks mentioned above, eventually this stuff will be baked into everything, but right now it's like being the G1 Android phone user, or the Gopher/Lynx/BBS internet user... you might not need it badly enough to climb up the learning curve.
posted by adekllny at 6:07 PM on May 27


I've used Gemini a little bit. It's been useful for suggesting language to use in personal notes, for example for expressing sympathy or concern, which is not something I'm accustomed to writing.

On the other hand, I've asked it for specific information, and it's always delivered hallucinations to some extent. I've learned to recheck.
posted by JimN2TAW at 6:52 PM on May 29

