The FTC alleges that Facebook misled developers and the public over security by failing to adequately review and verify applications.
So-called Verified Apps were given a visible badge, a check-mark, and a higher profile in search results, and were billed as offering users greater security thanks to a more detailed inspection process.
“The Application Verification program… is designed to offer extra assurances to help users identify applications they can trust – applications that are secure, respectful and transparent, and have demonstrated commitment to compliance with Platform policies,” Facebook claimed at the time.
But, says the FTC in its complaint, ‘verified’ apps were nothing of the sort: apps were given the green light in return for a $375 payment, and no further testing was ever carried out.
“Contrary to the statements set forth in Paragraph 46, before it awarded the Verified Apps badge, Facebook took no steps to verify either the security of a Verified Application’s website or the security the Application provided for the user information it collected, beyond such steps as it may have taken regarding any other Platform Application,” it reads.
“In truth and in fact, as described in Paragraph 47, in many instances Facebook has permitted a Platform Application to display its Verified Apps badge when its review of the application’s security has not exceeded its review of other Platform Applications.”
The Verified Apps program was closed in December 2009, after just six months, with Facebook explaining that it planned to extend the ‘verification’ to all Facebook apps. At that point, says the FTC, 254 apps had been verified.