It'd be great if GPT could provide its sources for the text it generated.
I've been asking it about lyrics from songs that I know of, but where I can't find the original artist listed. I was hoping ChatGPT had consumed a stack of lyrics and I could just ask it, "What song has this chorus, or one similar to X...?" It didn't work. Instead it firmly stated the wrong answer. And when I gave it time ranges it just noped out of there.
I think if I could ask it a question and it could go, "I've used these 20-100 sources directly to synthesize this information," it'd be very helpful.
To answer the question above, these systems cannot provide sources because they don’t work that way. Their source for everything is, basically, everything. They are trained on a huge corpus of text data and every output depends on that entire training.
They have no way to distinguish which piece of the training data was the "actual" or "true" source of what they generated. It's like the old questions "which drop caused the flood" or "which pebble caused the landslide".
> Their source for everything is, basically, everything. They are trained on a huge corpus of text data and every output depends on that entire training.
Bing Chat is explicitly taking in extra data at query time. It's a distinctly different setup from ChatGPT.
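To illustrate the distinction: in a retrieval-augmented setup, the system fetches specific documents and feeds them into the prompt, so it knows exactly which sources the answer drew on. A plain language model has no such list. Below is a minimal toy sketch of that idea (entirely hypothetical, not Bing's actual pipeline; the corpus, IDs, and overlap-based "search" are stand-ins for a real search index):

```python
def retrieve(query, corpus, k=2):
    """Rank documents by naive word overlap with the query
    (a stand-in for a real search/embedding index)."""
    words = set(query.lower().split())
    scored = sorted(
        corpus,
        key=lambda doc: len(words & set(doc["text"].lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query, corpus):
    """Assemble a prompt whose context carries explicit source IDs,
    so the model's answer can cite them."""
    docs = retrieve(query, corpus)
    context = "\n".join(f'[{d["id"]}] {d["text"]}' for d in docs)
    prompt = (
        "Answer using the sources below and cite them by ID.\n"
        f"{context}\nQuestion: {query}"
    )
    return prompt, [d["id"] for d in docs]

# Toy corpus with made-up IDs and contents
corpus = [
    {"id": "src-1", "text": "Song A chorus lyrics mention a river and a flame"},
    {"id": "src-2", "text": "Recipe for sourdough bread"},
    {"id": "src-3", "text": "Song B chorus lyrics mention the moon"},
]

prompt, cited = build_prompt("which song chorus lyrics mention a river", corpus)
# `cited` lists exactly the documents placed in the prompt, which is what
# makes source attribution possible in this setup and impossible for a
# model answering purely from its training weights.
```

The point is that attribution here is a property of the pipeline, not the model: the sources are known because they were explicitly inserted, not because the model traced its own weights.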