The photo storage app Ever has been found to be secretly using customers’ private snaps to train a commercial facial recognition system.

Ever launched as a cloud photo storage app in 2013, then swiftly pivoted into a facial recognition software venture in 2017. However, the app failed to inform its millions of users of the change. It was discovered that users’ images were being used to train an algorithm to identify faces.

The photos users upload to Ever are being used to train Ever AI, the company’s facial recognition system. Ever then offers to sell the technology to law enforcement and the military. Essentially, the company is building surveillance technology.

Miju Han, director of product management at HackerOne, said:

“Users need to consent to be training data. Faces are one of the most personal things we have, and people can have legitimate reasons for not wanting to be included in surveillance training systems.”

Ever denies allegations that it has deceived its users, stating that there is a facial recognition disclosure within its privacy policy. However, the disclosure is buried in a cryptic line stating: “Your files may be used to help improve and train our products and these technologies.” After being contacted by NBC News, the company added a line explaining that the products could include “enterprise face recognition software”.

“Unless they go through the terms of service in detail, they do not have guarantees about how their data is used,” said Han.

NYU law professor Jason Schultz said:

“They are commercially exploiting the likenesses of the people in the photos to train a product that is sold to the military and law enforcement.

“The idea that users have given real consent of any kind is laughable.”

Doug Aley, Ever’s CEO, stated that Ever AI does not share users’ photos or any personally identifiable information with its facial recognition customers.