Human Generated Data

Title

Untitled (group of women standing on stage with nun under flags)

Date

c. 1912-1915, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6031

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 89.5
Door 88.2
Person 85.9
Military 78.0
Military Uniform 72.5
Shop 58.5
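
A minimal sketch of how label tags like the ones above can be generated with Amazon Rekognition through boto3; the bucket and object names below are placeholders, not the actual source of this record.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},  # placeholder location
    MinConfidence=50,  # only return labels scored at 50% confidence or higher
)

for label in response["Labels"]:
    # each entry pairs a label name with a confidence score, e.g. "Person 85.9"
    print(f"{label['Name']} {label['Confidence']:.1f}")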

Clarifai
created on 2019-11-16

people 99.7
one 95.9
monochrome 95.5
adult 94.3
man 92.3
woman 89.7
movie 89.7
outfit 87.5
portrait 87.2
room 87.2
wear 86.7
vehicle 85.4
administration 85.3
child 84.8
street 84.8
military 83.2
group 82.8
two 82.7
television 82.6
home 82.1

Imagga
created on 2019-11-16

pay-phone 96.8
telephone 87.1
electronic equipment 61.2
equipment 47.6
pump 24.7
call 24.0
gas pump 21.4
door 20.1
business 16.4
man 16.1
box 14.8
industry 13.7
cable 13.3
center 13.2
mechanical device 12.9
phone 12.9
architecture 12.5
people 12.3
computer 12
technology 11.9
old 11.8
station 11.6
urban 11.4
connection 11
communication 10.9
building 10.7
person 10.6
communications 10.5
server 10.5
device 10.3
work 10.2
network 10.2
power 10.1
city 10
male 9.9
information 9.7
digital 9.7
public 9.7
indoors 9.7
men 9.4
security 9.2
data 9.1
modern 9.1
industrial 9.1
rack 9
worker 8.9
database 8.9
metal 8.8
interior 8.8
working 8.8
sliding door 8.8
switch 8.8
electrical 8.6
empty 8.6
mechanism 8.6
storage 8.6
travel 8.4
design 8.4
black 8.4
electronic 8.4
street 8.3
one 8.2
room 8.2
transportation 8.1
steel 8
cluster 7.8
factory 7.7
construction 7.7
machine 7.4
service 7.4
safety 7.4
occupation 7.3
open 7.2
home 7.2
job 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 98.7
clothing 95.3
black and white 93.2
person 91.1
street 79.2
footwear 75.2
coat 63.4
man 54.4
monochrome 53.4
door 52.7
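
A hedged sketch of the kind of call that yields tags like the Microsoft list above, using the Azure Computer Vision Python SDK; the endpoint, key, and image URL are placeholders, and the 2019 record may have been produced with an earlier API version.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://example-resource.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("EXAMPLE_KEY"),  # placeholder key
)

result = client.tag_image("https://example.com/photo.jpg")  # placeholder image URL
for tag in result.tags:
    # confidence is returned in [0, 1]; scale it to match the percentages above
    print(tag.name, round(tag.confidence * 100, 1))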

Color Analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 3-9
Gender Female, 54.8%
Surprised 45.2%
Fear 45.2%
Angry 45.1%
Disgusted 45%
Sad 46.3%
Calm 53%
Happy 45%
Confused 45.1%

AWS Rekognition

Age 22-34
Gender Male, 50.4%
Angry 49.5%
Happy 49.5%
Fear 49.9%
Disgusted 49.6%
Sad 49.6%
Calm 49.6%
Surprised 49.7%
Confused 49.5%

AWS Rekognition

Age 12-22
Gender Female, 50.1%
Sad 50.2%
Confused 49.5%
Fear 49.7%
Happy 49.5%
Surprised 49.5%
Angry 49.5%
Calm 49.5%
Disgusted 49.5%

AWS Rekognition

Age 30-46
Gender Male, 50.5%
Calm 49.7%
Confused 49.7%
Happy 49.6%
Surprised 49.6%
Sad 49.7%
Fear 49.7%
Angry 49.6%
Disgusted 49.5%

AWS Rekognition

Age 12-22
Gender Male, 50.4%
Confused 49.5%
Surprised 49.5%
Sad 49.7%
Calm 49.9%
Disgusted 49.6%
Happy 49.5%
Fear 49.6%
Angry 49.7%

AWS Rekognition

Age 22-34
Gender Male, 50.2%
Confused 49.5%
Calm 49.8%
Sad 49.7%
Surprised 49.6%
Happy 49.5%
Disgusted 49.6%
Fear 49.7%
Angry 49.6%

AWS Rekognition

Age 30-46
Gender Female, 50%
Fear 49.5%
Calm 49.5%
Disgusted 49.5%
Happy 49.5%
Sad 50.3%
Confused 49.5%
Surprised 49.5%
Angry 49.5%
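
A minimal sketch of the kind of Amazon Rekognition call that yields the per-face age range, gender, and emotion estimates listed above; the bucket and object names are placeholders.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},  # placeholder location
    Attributes=["ALL"],  # request age range, gender, emotions, and other attributes
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # emotion types come back uppercase, e.g. "HAPPY", each with its own confidence
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")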

Microsoft Cognitive Services

Age 6
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Feature analysis

Amazon

Person 85.9%

Text analysis

Google

MY
MY
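
A minimal sketch of optical character recognition with the Google Cloud Vision client, the kind of call behind the detected text above; the local file path is a placeholder.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder path
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
for annotation in response.text_annotations:
    # the first annotation is the full detected text block; the rest are individual words
    print(annotation.description)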