Human Generated Data

Title

Untitled (men and women in office)

Date

1939

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21909

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Human 99.7
Person 99.7
Workshop 98.1
Person 98
Person 97.8
Person 96.9
Person 91.8
Room 83.9
Indoors 83.9
Classroom 83.8
School 83.8
Person 81.5
Clinic 75.6
Lab 73.2
Person 72.6

Imagga
created on 2022-03-11

barbershop 100
shop 100
mercantile establishment 81.6
place of business 54.4
establishment 27.1
man 26.9
people 20.6
male 18.4
house 18.4
city 18.3
building 17.8
room 16.7
work 16.5
indoors 14.9
old 14.6
sitting 14.6
home 14.3
adult 13.7
architecture 13.3
person 13.1
office 12
street 12
working 11.5
business 10.9
travel 10.6
table 10.4
technology 10.4
men 10.3
industry 10.2
smiling 10.1
happy 10
computer 9.6
urban 9.6
machine 9.6
town 9.3
indoor 9.1
industrial 9.1
classroom 8.9
center 8.9
job 8.8
looking 8.8
day 8.6
roof 8.6
chair 8.3
holding 8.2
outdoors 8.2
worker 8.1
family 8
interior 8
lifestyle 7.9
couple 7.8
portrait 7.8
factory 7.7
modern 7.7
stone 7.6
senior 7.5
tourism 7.4
restaurant 7.4
equipment 7.3
occupation 7.3
cheerful 7.3
holiday 7.2
history 7.2
smile 7.1
businessman 7.1

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

building 99
text 96.2
kitchen 92.7
person 91.6
clothing 87.6
man 78.9
house 63.9
working 54.2
cluttered 38.6
shop 13.4

Face analysis

Amazon

Google

AWS Rekognition

Age 40-48
Gender Male, 99.8%
Sad 95.2%
Calm 2.5%
Confused 1.7%
Angry 0.2%
Disgusted 0.1%
Surprised 0.1%
Happy 0.1%
Fear 0%

AWS Rekognition

Age 38-46
Gender Male, 97.2%
Calm 97.1%
Sad 1.4%
Happy 0.5%
Confused 0.3%
Disgusted 0.2%
Fear 0.2%
Surprised 0.1%
Angry 0.1%

AWS Rekognition

Age 45-53
Gender Female, 59.8%
Sad 83.2%
Calm 14.4%
Happy 0.7%
Disgusted 0.6%
Angry 0.4%
Confused 0.3%
Surprised 0.3%
Fear 0.2%

AWS Rekognition

Age 20-28
Gender Female, 96.8%
Sad 69.7%
Fear 19.1%
Angry 3.9%
Happy 3.3%
Calm 2%
Confused 1%
Disgusted 0.5%
Surprised 0.5%

AWS Rekognition

Age 25-35
Gender Male, 68.1%
Calm 37.7%
Surprised 34.8%
Angry 10.7%
Disgusted 3.8%
Confused 3.3%
Sad 3.3%
Fear 3.3%
Happy 3.1%

AWS Rekognition

Age 36-44
Gender Male, 96.4%
Happy 82%
Calm 12.1%
Surprised 3.3%
Fear 0.6%
Disgusted 0.6%
Sad 0.5%
Angry 0.5%
Confused 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a group of people standing in a kitchen 82.2%
a group of people in a kitchen 82.1%
a group of people that are standing in the kitchen 76.3%

Text analysis

Amazon

7
A3DA

Google

20
20