Community Champion

Amazon has no responsibility for how its tools are used

Amazon has been promoting itself recently.

 

"Among the tools it is offering, Amazon’s image recognition product is the most controversial. For a per-minute fee of a few pence, Amazon Rekognition can scan video footage and, for example, pick up people’s faces that can then be checked against a client’s database.

 

"Civil rights groups have called it 'perhaps the most dangerous surveillance technology ever developed', and called for Amazon to stop selling it to government agencies, particularly police forces. City supervisors in San Francisco banned its use, saying the software is not only intrusive, but biased - it’s better at recognising white people than black and Asian people.

 

"Mr Vogels doesn’t feel it's Amazon’s responsibility to make sure Rekognition is used accurately or ethically.

 

“'That’s not my decision to make,' he tells me."
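For context on what "checked against a client's database" means in practice, here is a minimal sketch of the kind of face-search workflow Rekognition exposes via the AWS boto3 SDK. The collection name, bucket, file name, and threshold below are hypothetical placeholders, and the sample response is invented for illustration — it only mimics the shape of a real `search_faces_by_image` result.

```python
# Sketch of a Rekognition-style face search. The actual AWS call is shown
# commented out so the example runs without credentials; only the response
# handling below is live code.

# import boto3
# client = boto3.client("rekognition")
# response = client.search_faces_by_image(
#     CollectionId="client-watchlist",   # hypothetical collection name
#     Image={"S3Object": {"Bucket": "cctv-frames", "Name": "frame-0042.jpg"}},
#     FaceMatchThreshold=90,             # only return matches >= 90% similarity
#     MaxFaces=5,
# )

def matched_ids(response, min_similarity=90.0):
    """Extract external image IDs from a face-search response whose
    similarity score meets the threshold."""
    return [
        m["Face"]["ExternalImageId"]
        for m in response.get("FaceMatches", [])
        if m["Similarity"] >= min_similarity
    ]

# Invented sample in the shape Rekognition returns (values are not real output):
sample = {
    "FaceMatches": [
        {"Similarity": 97.1, "Face": {"FaceId": "a1", "ExternalImageId": "person-123"}},
        {"Similarity": 88.4, "Face": {"FaceId": "b2", "ExternalImageId": "person-456"}},
    ]
}
print(matched_ids(sample))  # only the match above the threshold survives
```

The per-minute pricing mentioned in the article comes from running this kind of search against every frame of a video stream; the ethical debate is about who ends up in the "client-watchlist" collection and what happens on a false match.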


............
This message may or may not be governed by the terms of
http://www.noticebored.com/html/cisspforumfaq.html#Friday or
https://blogs.securiteam.com/index.php/archives/1468
3 Replies
Contributor I

Re: Amazon has no responsibility for how its tools are used

Amazon is not the only company with tools like this.

 

Investigators of the Boston Marathon bombing used Israeli-based facial recognition tools to trace the bombers' movements.

 

Casinos use facial recognition software to match faces against databases of banned players or other flagged people.

 

Airports use software of this type.

 

Is this good? Yes and no: there are situations where you want this capability, but it can very easily go from a tool that assists to a tool that harms the innocent, with a huge Big-Brother-is-watching-me aspect.

 

Is the fact that Amazon is trying to sell this tool any different from what Google did with the Chinese government, or from the collection of end-user data by thousands of different software products? I think not.

Sincerely,

Mike Glassman, CISSP
Iguana man
Community Champion

Re: Amazon has no responsibility for how its tools are used


@MikeGlassman wrote:

Amazon is not the only company with tools like this.

 

Investigators of the Boston Marathon bombing used Israeli-based facial recognition tools to trace the bombers' movements.

 

Casinos use facial recognition software to match faces against databases of banned players or other flagged people.

 

Airports use software of this type.

 

Is this good? Yes and no: there are situations where you want this capability, but it can very easily go from a tool that assists to a tool that harms the innocent, with a huge Big-Brother-is-watching-me aspect.

 

Is the fact that Amazon is trying to sell this tool any different from what Google did with the Chinese government, or from the collection of end-user data by thousands of different software products? I think not.


Everyone here understands the pros and cons of surveillance technologies, but I think the questions we should be asking are these:

1. Should these tools be in the open market?

2. Should they be allowed for use proactively in public spaces for any purpose other than security?

3. If they are permitted only for security-related activities, should they be restricted for use only as forensic tools after the fact?

 

Consider that if they are freely available for unrestricted use, we can surrender all hope of retaining whatever little privacy is left to us.

If the trend of unrestricted sale and use of these tools continues, municipalities will soon be selling CCTV footage for profit.

Contributor I

Re: Amazon has no responsibility for how its tools are used

Although I agree with the sentiment behind what you wrote, I think we all need to accept that it really doesn't matter whether we agree or disagree (I am very much pro-privacy, by the way) that this sort of technology should be on the open market, because it already is.

 

Nothing we do or say will change that, and any limitations placed on sales will simply create a secondary market (as we have seen in every case of restricting technology) with tools that have far fewer controls on them.

 

I very much agree that they should not be allowed to be used proactively. But then, if the police use this technology to search for subjects of interest, and such a person is apprehended because of it, yet it turns out to be a false lead... is that OK?

 

This is an issue I struggle with. It is not a security issue; it is a criminal-justice issue. I have the same concern about searching fingerprints against a database that was never designed for that purpose (the criteria for creating the database do not list this as an approved use) in order to find people, criminal or not.

 

My biggest fear is the misuse of these technologies by so-called do-gooders, whether they are security personnel, police, or others.

 

And finally, once a tool is approved for use, you have no way to actually ensure it is used only for what it was approved for, and very quickly the approval will widen to allow broader use.

 

Some people may think I am being overly paranoid, but there have been many incidents that prove me right, and not just in third-world countries or in China and Russia, but in the US, Israel, Germany, France, and many more.

Sincerely,

Mike Glassman, CISSP
Iguana man