I've used ChatGPT and Bard some, and what I found interesting is that the answers sometimes have sort of a human quality to them as opposed to mechanical. For instance, in one response it said ". . . by the way, just as a reminder . . ." and
". . . don't conclude from this that X will do what you want; it may very well be that Y is the better choice for you . . ."
Those are more reflective of an entity with conscious awareness than of a machine producing text. It made me feel as if I should thank it for its response. Nothing like Alexa or Siri.
But isn't that the nature of how it was trained? It was taught how we talk, and it responds in ways we would like. The reason I don't feel like I should thank it is that it feels nothing either way.
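Roughly, "taught how we talk" just means the model learned, from huge amounts of human-written text, which words tend to follow which. Here's a minimal sketch of that idea in Python; it's a toy bigram counter over a made-up corpus, nothing like a real neural network, just the statistics in miniature:

from collections import Counter, defaultdict

# Made-up illustrative "training text"; real models see billions of words.
corpus = "thank you so much . you are welcome . thank you again".split()

# Count, for each word, which word follows it in the human-written text.
next_word_counts = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    next_word_counts[current][nxt] += 1

def predict_next(word):
    """Return the continuation seen most often in training; None if unseen."""
    counts = next_word_counts[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("thank"))  # -> "you": polite phrasing as learned statistics

The point of the toy: the polite, human-sounding turns of phrase come out because they went in, not because anything is felt.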
Experts are projecting 73 million jobs will be lost to automation by 2030. Robots can perform routine manufacturing jobs and jobs in food service. Autonomous vehicles will replace truck drivers and cab/Uber drivers. You're going to have increased unemployment completely divorced from any economic factors, and companies' margins are going to grow to ridiculous, unfathomable sizes.

Not if nobody has a job.
If you look at how AI is being marketed by IBM and other large tech companies, they are pushing it as a way for your workers to be more productive. In reality, it's about having fewer workers. But I think if things start getting bad (mass layoffs), you'll see a tremendous backlash. And I think these tech companies know it too.
That or we will all just be construction workers and therapists in the future, I guess.

ChatGPT is already as good as a therapist.

That would explain a lot about you, Waterhead.
Wayne inputs a lot of weird role-play prompts.

In my opinion, therapy is very formulaic. Like if you have OCD, there's a series of steps you go through.

So are the erotic stories you keep writing about Fred Minnick, imo.
It's important to remember one basic thing about AI: it has no skin in the game. So the name "AI" is a bit of a misnomer. AI is fundamentally very stupid because it suffers no consequences for being wrong. It also receives no benefit for being right. It's a zombie. Someone for whom there are consequences has to check its results.

You are looking at this all wrong. AI doesn't replace a brain. It can't, and it was never meant to. AI merely integrates human language, big data, and organizational skills. This makes it good at spatial tasks, data and studies integration, and broad-sweep data analysis for things like whether I should launch my nukes. The second you ask it for a moral judgment, you stop using AI and start getting a developer's political views. At that point you just have yet another talking head. So a question like "analyze the top 10 drivers and risk mitigations for lung cancer in every ethnic group" is a fair question for a mature AI. A question like "when should a lung cancer patient be allowed to die" is a stupid question for AI. That's a morals, personal-choice type of question. That's not stupid AI. That's a dumbass asking questions.
I suspect there are marketing reasons behind the current AI boomlet. The basic premises of the tech haven't changed, and the capacities are where they were. Brain research has revealed that modern super-sophisticated neural nets are laughably less sophisticated and complex than brains. And anything with an on/off switch will always be uninterested in consequences, which kind of guts any resemblance to "intelligence."
I'm a friend of Sarah Connor. I was told she was here. Can I see her please?

I was always surprised at the gun store owner's response when asked if he had a phased plasma rifle in the 40-watt range. "Hey, just what you see, pal." That's it?