A.I. is B.S.

It's... complex.

Short answer:
- Build a PC that can handle one of the smaller large language models (the requirements to run a local copy of ChatGPT are EXTREMELY high, so that's off the table). A local model like Llama 3 or something along those lines.
- An AI-derived voice profile that sounds like EDI (it can be cloned from the massive number of speaking lines EDI has in the Mass Effect games).
- A hell of a lot of fine-tuning to get the responses you'd expect from a chat assistant.
- Speech-to-text integration with room microphones, plus strategically placed speakers.
- Integration with smart home software that can control lights, displays, silly dioramas, etc.
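As a rough sketch of how the speech-to-text output could be wired to the smart home hooks, here's a tiny keyword router in Python. Every name and phrase in it is hypothetical, just illustrating the idea: recognized commands go to device handlers, and anything unmatched falls through to the local LLM for a conversational reply.

```python
import re

# Hypothetical intent table: regex patterns for smart-home commands mapped
# to handlers. These names and phrases are illustrative, not a real API.
HANDLERS = {
    r"\blights? (on|off)\b": lambda m: f"lights -> {m.group(1)}",
    r"\bweather\b": lambda m: "fetch weather from a whitelisted site",
}

def route_command(text: str) -> str:
    """Route transcribed speech: known intents hit a device handler,
    anything unmatched falls through to the local LLM for a chat reply."""
    for pattern, handler in HANDLERS.items():
        m = re.search(pattern, text.lower())
        if m:
            return handler(m)
    return "send to local LLM"

print(route_command("EDI, turn the lights off"))  # lights -> off
```

The nice part of this split is that the deterministic stuff (lights, dioramas) never has to touch the model at all, so it stays fast and predictable.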

I've seen a couple people doing similar projects and all the tools are out there nowadays. This is still a massive undertaking and will be something I work on for quite a while.

I know people here aren't very keen on AI, but this is simply a personal project for fun that won't be allowed access to the internet. Think a customized version of Alexa or Siri.
That sounds like an intense but fun project. I might wait for more tools to come along that can bridge some of those steps but I could see myself swinging big like that 20 years ago.
 

figmentPez

Staff member
I don't know if it's A.I. itself so much as what it's used for.
Also, the source information for a lot of A.I. is a problem. The plagiarism machines were trained on a lot of data they should not have used, and no matter what they're used for in the future, they're still built on stolen property, and quite frequently spit out ideas nearly wholesale.

Also, there are problems with how A.I. is used. A recent study on the use of A.I. in diagnostics found a major flaw: humans are likely to just agree with whatever the computer says, regardless of whether it's correct. Since this type of A.I. can't provide reasoning for the conclusions it reaches, that's a huge problem if humans can't be relied upon to disagree when it's warranted.

In addition, A.I. is such a nebulous term that many technologies that are not large language models, or whatever the image equivalent is (is there a better, non-pejorative term for this class of machine learning/logic?), are getting lumped in. Not every Photoshop filter is "AI". Not every upscaling method is AI.

Lastly, there's the energy usage, but that's a large-scale problem, and it mostly ties back to what A.I. is being used for (i.e. so many things that it's not actually well suited to).
 
Also, there are problems with how A.I. is used. A recent study on the use of A.I. in diagnostics found a major flaw: humans are likely to just agree with whatever the computer says, regardless of whether it's correct. Since this type of A.I. can't provide reasoning for the conclusions it reaches, that's a huge problem if humans can't be relied upon to disagree when it's warranted.
This is actually why I want to heavily restrict or even block my AI assistant's internet access, except maybe to sites I whitelist myself (weather, stocks, things like that). If you carefully control what it uses for reference, you can get significantly better (more accurate) results. Reddit will be instantly blacklisted lol.
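A minimal sketch of that whitelist idea, assuming outgoing requests get filtered by hostname before the assistant is allowed to fetch anything. The hostnames here are just placeholder examples:

```python
from urllib.parse import urlparse

# Example allow/deny lists -- these hostnames are placeholders for
# whatever weather/stocks sources actually get whitelisted.
ALLOWED_HOSTS = {"api.weather.gov", "query1.finance.yahoo.com"}
BLOCKED_HOSTS = {"reddit.com", "www.reddit.com"}

def is_allowed(url: str) -> bool:
    """Default-deny: a fetch only goes through if the hostname is
    explicitly whitelisted and not explicitly blacklisted."""
    host = (urlparse(url).hostname or "").lower()
    if host in BLOCKED_HOSTS:
        return False
    return host in ALLOWED_HOSTS

print(is_allowed("https://api.weather.gov/points/39.7,-104.9"))  # True
print(is_allowed("https://www.reddit.com/r/anything"))           # False
```

Default-deny is the important design choice: anything not explicitly on the list is out, so new garbage sources can't sneak in as references.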
 
I bought a few boxes from Uline a few months ago. Now I get their huge catalog. It feels like I'm in the '70s/'80s again. I kind of like that there are still companies that put out large physical catalogs just to browse through.
that said...this is the "A.I. is B.S." thread...

[attached image: 20240917_124545.jpg]


I dunno what it is about the color, composition, and lighting of AI-generated images, but I'm starting to just instantly get a feel that something is AI-generated before I even nitpick over the defects/clues.

But, yeah...that bear's paws definitely have the hallmarks of AI generation. And that thermos has a weird handle thing on it. Which annoys me, because they didn't even do the basic cleanup that *I* do on an AI image, and I don't get paid for doing it.
 