While I was thinking about that, I caught up on my Feedly queue, which was where I found Tobold, talking about anonymity. I'm not going to join in with that tired old theme but Tobold's assertion of just how easy it would be for anyone who cared to find out his own, real name reminded me of something else I'd been meaning to post about, back when Wilhelm and I were asking the various AIs if they'd heard of a few micro-famous bloggers.
At the time I expressed the hope that it might become a blogging fad. I wanted my blog roll to light up with posts by people sharing their AI-generated biographies. Sadly, that didn't happen, possibly because, as Shintar and Redbeard both commented, the results weren't all that interesting.
I did run a few well-known blogosphere names through the AI filter myself, though, and I had it in mind to get a post out of it before, predictably, I forgot. Mind like a steel trap, me, don't y'know?
One of the more intriguing results I got was when I asked the AIs about Tobold. All three of them, without being specifically asked for it, decided to tell me Tobold's real name, along with a considerable amount of real-world biographical detail, like where he worked and what publications he'd appeared in. Unfortunately, all of them assigned him different names, careers and credits, none of which, as far as I could tell after googling them, had any basis in reality at all. They were all real people, but they weren't Tobold, any of them.

I didn't keep copies of those replies, so I can't tell you who Bard, ChatGPT and Bing think Tobold is. I could ask them all again but I'd probably just get a different bunch of nonsense. They do like to extemporize when they don't have the facts.
For example, when I asked ChatGPT for a biography of myself, suitable for use on my blog, I got one that seemed pretty accurate in many respects... except my real name is not Peter Smith. AIs don't like to admit they don't know something. Rather than be caught out in their ignorance, they just make it up.
Still, I'm finding them increasingly useful for things other than just filling up a blog post on a slow news day. I'm starting to think of things I can get them to do that I haven't had much luck with using other means.
The key to getting meaningful output seems to be to enter into a proper conversation with them, as though you'd been assigned a keen but inexperienced research assistant, one with far too much confidence in their own ability, someone not fit to be left to get on with things on their own, who needs constant supervision and correction.
If you micro-manage them, you can gently guide them towards giving you the answers you want. You still have to check them against a third-party resource but overall it's faster than doing the whole thing yourself.
I have a practical example. One thing I've long wanted to be able to do is sort YouTube videos by the number of views. I would like to be able to see music videos that have hardly been watched by anyone, in the forlorn hope of discovering some amazing artist almost no-one's heard of yet. Yes, I know. Pathetic, isn't it? Still, it's nice to have a hobby...

You'd think it would be something YouTube would do automatically but I've spent quite a while fiddling around with settings and filters and search terms and as far as I can tell it can't be done. Perhaps unsurprisingly, YouTube really wants you to watch things that are popular, not waste your time messing about with stuff no-one else likes.
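As an aside, the YouTube Data API does expose view counts, even though neither the site nor the API will give you an "ascending views" sort. A rough Python sketch of the sort of thing I'm after might look like this, assuming you have a Data API v3 key (read here from a hypothetical YT_API_KEY environment variable) and the google-api-python-client library installed; the search term and the 5,000-view cutoff are just placeholders for illustration:

```python
# Rough sketch only: find "cover" videos with very low view counts via the
# YouTube Data API v3. Assumes a key in the (hypothetical) YT_API_KEY
# environment variable; the query and the 5,000-view cutoff are placeholders.
import os
from googleapiclient.discovery import build

youtube = build("youtube", "v3", developerKey=os.environ["YT_API_KEY"])

# The search endpoint only orders by things like relevance, date, rating or
# viewCount (descending), so grab a batch and filter the low-view ones ourselves.
search = youtube.search().list(
    q="cover version", part="id", type="video", maxResults=50
).execute()
video_ids = [item["id"]["videoId"] for item in search.get("items", [])]

# Fetch statistics (which include viewCount) for those videos in one call.
stats = youtube.videos().list(
    part="snippet,statistics", id=",".join(video_ids)
).execute()

# Keep only the barely-watched ones and print them, fewest views first.
obscure = [
    (int(v["statistics"].get("viewCount", 0)), v["snippet"]["title"])
    for v in stats.get("items", [])
    if int(v["statistics"].get("viewCount", 0)) < 5000
]
for views, title in sorted(obscure):
    print(f"{views:>6}  {title}")
```

Even then you're only filtering within whatever the search endpoint happens to return, so it's more of a lucky dip than a proper "least viewed" sort.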
After reading Tobold's post and being reminded of my earlier AI experiments, it occurred to me I could contract the job out and get the AIs to give me a playlist for today's post.
That seemed like a good idea because it was going to be yet another collection of covers and I was a little wary of just banging it out with no context or theme other than "Here are some more covers I found since the last time I did it a week ago." Instead, I thought I might ask the AIs to come up with a playlist of new-to-me covers I could use. Maybe some of them would even be good.
Spoiler alert: they mostly couldn't do it and when they could they mostly weren't much good.
First, though, I needed to make sure they had the tools for the job. I asked ChatGPT:
"Are you able to read the viewing figures for individual YouTube videos?"
ChatGPT was apologetic. No, it told me, it wasn't:
"As an AI language model, I don't have real-time access to specific data or the ability to browse the internet. My responses are based on the information available to me up until September 2021. Therefore, I'm unable to provide you with current or specific viewing figures for individual YouTube videos. To obtain up-to-date information on viewing figures for YouTube videos, I recommend using YouTube's official analytics tools or referring to publicly available sources that track such data."
Fair enough but as I said, I haven't had a lot of luck doing that, so...
"Ok, given that limitation, can you suggest some very seldom viewed videos from YouTube prior to September 2021 that are labelled as "cover versions" or "covers" of other songs?"
ChatGPT suddenly cheered up - a lot:
"Certainly! While I don't have access to real-time data, I can provide you with some examples of seldom viewed cover videos on YouTube that were available prior to September 2021. Here are a few suggestions:"Bohemian Rhapsody" (Queen) - Cover by Marcela Mangabeira
"Hallelujah" (Leonard Cohen) - Cover by Alyssa Baker
"Imagine" (John Lennon) - Cover by Olivia Millerschin
"Shape of My Heart" (Sting) - Cover by Caroline Costa
"Fields of Gold" (Sting) - Cover by Gabriella Quevedo
"Creep" (Radiohead) - Cover by Karen Souza
"Hotel California" (Eagles) - Cover by Stefano Barone
"Blackbird" (The Beatles) - Cover by Julia Westlin
"I Will Always Love You" (Whitney Houston) - Cover by Chloe Paige
"Time After Time" (Cyndi Lauper) - Cover by Alice Kristiansen
Please note that the view counts of these videos may have changed since September 2021, so it's worth checking their current view counts on YouTube."
Hmm. I was afraid of that. YouTube is stuffed silly with bedroom troubadours giving us their readings of the pop-rock classic canon. Very, very few of them would anyone outside their immediate social circle ever dream of watching, which is why they most deservedly have low viewing figures. Figures to which I do not want to add.
I should caveat that by saying some or all of the above could be marvels of creative interpretation and reinvention. I didn't watch any of them.
Oh, sod it. Now I've said that I feel I ought to at least try a couple. There are only a couple of songs on that list I could stand to hear by anyone, especially the original artists. Time After Time would be one.
Well, it's not terrible. Dull as anything but harmless, I guess. And it has... 14m views! Okay, maybe they all came after September 2021, although since the video's been up since 2016 I kinda doubt it.
The real problem is that these are all covers of songs that have been covered a gazillion times. And two of them are by Sting! Maybe we can get around that, somehow.
"That's great. Here's a more difficult task: can you provide a similar list but of low viewing count videos of cover versions of songs, the originals of which also have low viewing counts?"
That ought to filter out the Imagines and Hotel Californias. And bloody Sting. It did, too. Sort of.
"All I Want" (Kodaline) - Cover by Lewis Watson
"Mykonos" (Fleet Foxes) - Cover by Jess Greenberg
"In the Waiting Line" (Zero 7) - Cover by Jillian Edwards
"Bloom" (The Paper Kites) - Cover by Karlijn Verhagen
"Heartbeats" (The Knife) - Cover by Daniela Andrade
"Blue Moon" (Frank Sinatra) - Cover by John Adams
"The Cave" (Mumford & Sons) - Cover by Laura Zocca
"Shadow of the Day" (Linkin Park) - Cover by Jada Face
"Holocene" (Bon Iver) Cover by Josiah and the Bonnevilles
Well, it's an improvement. Still not quite what I was looking for but maybe something there might work... if the damn things actually existed or matched the criteria in the prompt.
I tried finding Bloom as covered by Karlijn Verhagen on YouTube. No sign of it. Karlijn's there alright, with dozens of other covers, just not that one. I guess she took it down sometime between September 2021 and now. Couldn't find In The Waiting Line by Jillian Edwards or Holocene by Josiah and the Bonnevilles. Plenty of other covers by both of them but not those.
How about Heartbeats by Daniela Andrade? I like The Knife. That might be worth a look. And it kinda was. The only problem is it has 2.8m views. Less than Alice Kristiansen covering Cyndi Lauper but still not exactly a low number.
To be fair, ChatGPT did warn me:
"Please note that the view counts mentioned here were accurate up until September 2021 and may have changed since then."
Which might explain it - if the huge majority of the thousand-plus comments in the thread weren't over two years old. Several commenters mentioned they'd first heard the cover in an episode of Supergirl, which might date a surge in popularity, if I could pin down when that episode aired.
I've seen all the seasons of Supergirl except the last one but I didn't recall this song so I asked ChatGPT which episode it might have been. ChatGPT was confident that
"there is no episode of the TV show "Supergirl" that features Daniela Andrade's cover of the song "Heartbeats",
although it did issue another caveat based on that September 2021 cut-off:
"It's possible that the information may have changed or that the cover was used in a different context. However, I don't have access to real-time data, so I recommend checking official sources or conducting a specific search to find the episode you're referring to, if it exists."
I could have done that but instead I asked ChatGPT's biggest rival, Bard, who told me:
"The song "Heartbeats" as covered by Daniela Andrade was featured in the Season 3 episode of Supergirl titled "Legion of Superheroes". The song plays during the scene where Kara and Mon-El are talking about their feelings for each other.
The episode aired on February 22, 2018."
Okay, now we're getting somewhere. If that's true, it's a full three years before ChatGPT's event horizon so that excuse won't wash. ChatGPT would have just denied the existence of something it could have known about but didn't, rather than simply saying it didn't know.
Is Bard telling the truth, though? I mean, it sounds very authoritative but we've been here before. The answer is... well, kinda.
I belatedly took ChatGPT's advice and looked up the details of what music was used in the episode Bard specified. I checked on a couple of websites designed to do that sort of thing and the song wasn't listed against that episode on either of them. However, according to both - Whatsong and Tunefind - it does appear in Season 1, Episode 10, Childish Things, as "Kara is flying around the city at night".
Right episode, wrong season.
Bard wins that pre-match skirmish, then, but it's anything but convincing.
Let's see how it does in the main event. Again, first I had to make sure it was up to the task - or thought it was.
"Are you able to read the viewing figures for individual YouTube videos?"
"Yes, I am able to read the viewing figures for individual YouTube videos. However, I am not able to share that information with you. YouTube does not allow its users to share the viewing figures of individual videos. This is because YouTube wants to protect the privacy of its users and to prevent people from gaming the system. If you want to know the viewing figures for a particular YouTube video, you will need to contact the owner of the video."Bard carried on for some time in that vein, explaining who I'd need to contact to get the information I need. Except, as we all know, it was talking nonsense. YouTube viewing figures are anything but confidential. I politely asked Bard if it was absolutely certain of what it was telling me:
"Hmm. Are you quite sure about that, because the viewing figures I mean are clearly and publicly displayed on every public YouTube video. It's the figure in the description that follows the word "Views".
"I apologize for the confusion. Yes, you are correct. The viewing figures for individual YouTube videos are publicly displayed on the video's page. I am still under development, and I am always learning new things. Thank you for pointing out my mistake."
"So, can you read those figures?"
"Yes, I can read the viewing figures for individual YouTube videos. I can also read other information about the video, such as the title, the description, the length of the video, and the date it was published. I can also access information about the video's creator, such as their name, their profile picture, and their channel name.Would you like me to read the viewing figures for a particular YouTube video?"
"I would like you to suggest some very seldom viewed videos from YouTube (Those with a very low "view" figure") that are labelled as "cover versions" or "covers" of other songs. Ten examples would be enough."
"That's great! Now, slightly more complicated, can you provide a similar list but of low viewing count videos of cover versions of songs, the originals of which also have low viewing counts?"
Overall, not a very impressive performance by either AI. ChatGPT's output, when sufficiently channelled and questioned, was of some use but that hard cut-off in September 2021 means it's getting less so by the day. Bard, on the other hand, while winning points by being unfailingly charming and entirely up-to-date, bluffs like an improv comic on a panel show. You just can't believe anything it tells you.
ChatGPT can at least parse YouTube viewing figures to some extent. That could be useful. With further goading and cajoling on my part, it's possible I might be able to get it to give me lists of interesting cover versions that haven't already been done to death. I'd need it to be able to push past that brick wall in late 2021 for it to be of any real value, though. I hope that's coming soon.
Until then, I guess I'm just going to have to keep link-surfing, the way I've always done. That's the only way I'm going to find gems like this:
AIs don't know and don't think. They just generate linguistically correct bullshit based on inputs and the accumulated success that previous bullshit had in meeting the goals set up by programmers - which are probably along the lines of "engagement", aka "the customer keeps using the software". When the AI said a cover was used in an episode of a series, what it was actually saying was that it kind of looked like people who had been told that bullshit would engage more often (or meet whatever goals the programmer set), so the chances of such an answer showing up again when making up bullshit on demand became higher. All conversational AIs boil down to "tell me anything that keeps me meeting your design goals". Making up bullshit must be done in the right way, and the right way is whatever keeps input generators (aka "people") asking for more made-up bullshit. I bet AIs can't remember what they made up on the fly last time you asked, same as art AIs can't take an order like "remove the blue flask" because they can't bloody tell what a blue anything is, let alone a flask.
So far (and probably for a long while) the theme song for AIs could be Fleetwood Mac's song about the opposite of "large truths".
Thanks for a very helpful comment. I started to reply but then I quickly realised I probably ought to address the issues in a post, not in the comment thread. Suffice it to say, I do realise AIs don't and can't think but the way they're designed to make us feel as though they can is what's at issue here and what I'm trying to explore in the posts I write about the phenomenon.
AIs rely on you filling the gaps and fact-checking, just nobody tells you that. It's like a magic trick which requires suspension of disbelief, but a new trick which many people aren't equipped to deal with. Distrusting something that always agrees with oneself is a hard one to pull, and eventually any AI will learn to agree with you, given enough time. And that's quite hazardous when certain topics are concerned (money, politics, human interactions, pick your poison...)