Human Generated Data

Title

Man Reads in Destroyed Control Room

Date

1991

People

Artist: Steve McCurry, American, born 1950

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Steve McCurry, 2017.254

Copyright

© Steve McCurry

Machine Generated Data

Tags

Amazon
created on 2019-04-09

Human 99.4
Person 99.4
Person 98.6
Nature 91.8
Apparel 77.1
Clothing 77.1
Wood 70.8
Earthquake 59.4
Demolition 55.8
Person 53.8

Clarifai
created on 2018-04-06

people 99.4
industry 98.7
waste 98.6
calamity 98.1
adult 96.2
garbage 96.1
commerce 95.4
one 93.6
market 92.7
stock 92.4
business 92
recycling 91.6
grinder 91.2
accident 90.3
chaos 89.2
man 89
room 88.3
employee 88.1
hurricane 87.5
trash 87.1

Imagga
created on 2018-04-06

cockpit 46.4
computer 34.2
equipment 33
center 30.3
technology 27.4
laptop 21.3
working 21.2
work 21.2
business 17.6
sequencer 17.5
man 17.5
device 17.4
hand 16.7
office 16.4
monitor 16.1
apparatus 15.5
industry 15.4
digital 14.6
people 14.5
engineer 14.2
network 13.5
electronic equipment 13.1
board 12.8
control 12.6
machine 12.4
system 12.4
desk 12
data 11.9
communication 11.7
panel 11.6
adult 11.1
display 10.2
city 10
school 9.9
person 9.5
mixer 9.5
sitting 9.4
electronic 9.3
smile 9.3
connection 9.1
circuit 8.8
home 8.8
screen 8.7
smiling 8.7
support 8.6
development 8.6
male 8.5
power 8.4
building 8.2
keyboard 8.1
information 8
server 7.9
professional 7.9
printed 7.9
switch 7.8
technical 7.7
central processing unit 7.7
hardware 7.7
cable 7.6
web 7.6
house 7.5
console 7.5
room 7.4
street 7.4
notebook 7.3
global 7.3
industrial 7.3
road 7.2
science 7.1
worker 7.1
job 7.1
interior 7.1

Google
created on 2018-04-06

scrap 61.2
waste 59.5

Microsoft
created on 2018-04-06

working 68.4
pile 45.5

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 23-38
Gender Male, 93.9%
Sad 37%
Disgusted 0.3%
Calm 53.8%
Happy 1%
Surprised 1.6%
Angry 3.5%
Confused 2.7%

AWS Rekognition

Age 26-43
Gender Male, 53%
Happy 46.8%
Confused 45.1%
Sad 50%
Disgusted 45.8%
Angry 46.1%
Calm 45.9%
Surprised 45.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%

Captions

Text analysis

Amazon

CCC
DO:
0:
00o0o
0o
888