Minnesota

AI-generated ads on Eros and MNE
Tango123144 14 Reviews 986 reads
posted

Many companies' stock performance skyrocketed in 2023-24 on the promise of AI. On Eros, AI not only serves up the wrong profile in ads but also, I assume, collects and stores all verification information to sell to dark sites. Eros ads are so fake it's not even funny. Going by two of my encounters, with a Brazilian and a Russian provider, MN is getting the scraps from the bottom of the barrel: fake ads, with the data sold to dark sites based on perception. MNE also runs somewhat outdated and photoshopped pictures. Some AI collect verification picture and reverse search it in internet to find information matching from social media platform I anticipate worrisomely. 411 should offer centralized and encrypted verification for clients and providers and protect us from an AI takeover.

But nothing has changed, really, except the wave of f***ing agencies using unrecognizable numbers and random names and pics plucked out of the blue, especially on Eros. People must be responding or the trend wouldn't have lasted.  
As for data collecting/processing, of course that existed long before AI, though AI might give it a bit more zing by enabling marginal criteria that used to not be worth pursuing.  
Bottom line:  
Pics mean nothing if they are new and you can't connect them to or recognize the woman, even if they make you very hard.  
Phone numbers without a credible history, or without someone on the other end who can verify themselves/their agency first, are the last numbers I will give any info to.  
So nothing has changed. I don't use 411 either.  I do rely heavily on the women (and the men) I know. It's all a hustle; the fact that there are some really lovely people hustling makes it a lot of fun, while the scammers add just a little spice.

Some AI-generated photos come bundled with malware. If someone sends you a photo, you won't be able to determine the real file type as effectively as an IT expert, and if you download that cute picture from an ad or a person, you may now have malware on your phone. AI has expanded that horizon tenfold or more by hiding the true file extension effectively.

Seeing someone just based off of a few pictures of a hot babe remains a risky proposition, regardless of whether or not it was created by AI.

"Some AI collect verification picture and reverse search it in internet to find information matching from social media platform I anticipate worrisomely". The most tortured sentence ever written.

Translation:  

They take a picture that you send for "verification" and run it through a site like TinEye (reverse image search) to find info on you. Then they use that information to blackmail you.  

This has been happening since LONG before AI.

Yes, it started long ago, and now it is automated, AI- and cloud-based, with more reach than ever before….

Well, if you are going to send them a picture, use a picture taken just for that purpose that you will never use anywhere else and has nothing in the background that gives away anything about you.
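One thing worth adding to that advice (my addition, not the original poster's): a picture with a clean background can still leak where it was taken, because phones embed EXIF metadata, often including GPS coordinates, inside the JPEG file itself. A stdlib-only sketch that just detects whether a JPEG still carries an EXIF segment (`has_exif` is my own helper; actually stripping the data is a job for your phone's camera settings or an image tool):

```python
# Walk a JPEG's header segments looking for APP1/Exif, the segment
# where camera metadata (including GPS tags) is stored.
def has_exif(jpeg_bytes: bytes) -> bool:
    """True if the JPEG contains an APP1 segment with an Exif header."""
    if not jpeg_bytes.startswith(b"\xff\xd8"):
        return False              # not a JPEG at all
    i = 2
    while i + 4 <= len(jpeg_bytes) and jpeg_bytes[i] == 0xFF:
        marker = jpeg_bytes[i + 1]
        if marker == 0xE1:        # APP1: EXIF lives here
            return jpeg_bytes[i + 4 : i + 10] == b"Exif\x00\x00"
        if marker == 0xDA:        # start of image data: no more headers
            break
        # Segment length field includes its own two bytes.
        seg_len = int.from_bytes(jpeg_bytes[i + 2 : i + 4], "big")
        i += 2 + seg_len
    return False
```

If this returns True on a picture you were about to send, re-export or screenshot it first so the metadata is gone.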

I'm going to start using AI on my pics. Oh boy I'm handsome.

Would guess cheaper than Aldo.👍

...Aldo is deaf and blind by this point, and yet there are still ads on Eros with his name on them.
