Human Generated Data

Title

Untitled (men playing cards)

Date

1945

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1652

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Human 99.8
Person 99.8
Person 99.4
Person 99.4
Person 99.3
Restaurant 95.9
Person 89.6
Cafeteria 86.3
Person 85.3
Shop 80.8
Building 79.6
People 71.6
Food 69.3
Meal 69.3
Nature 68.3
Lab 66.5
Clinic 62.7
Pharmacy 57.9
Cafe 56.7
Crowd 55.9
Apparel 55.8
Clothing 55.8
Person 45.4
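
The Amazon tags above are the kind of label/confidence pairs returned by Rekognition's DetectLabels API (via `boto3`, each label arrives as a dict with `Name` and `Confidence` keys). The sketch below is illustrative only: the sample `response` and the `format_labels` helper are assumptions, not part of this record.

```python
# Hypothetical sample mirroring the shape of a Rekognition DetectLabels response.
response = {
    "Labels": [
        {"Name": "Human", "Confidence": 99.8},
        {"Name": "Restaurant", "Confidence": 95.9},
        {"Name": "Cafeteria", "Confidence": 86.3},
    ]
}

def format_labels(resp, min_confidence=45.0):
    """Render labels as 'Name score' lines, like the tag list above."""
    return [
        f'{label["Name"]} {label["Confidence"]:.1f}'
        for label in resp["Labels"]
        if label["Confidence"] >= min_confidence
    ]

for line in format_labels(response):
    print(line)
```

A real call would pass the image bytes to `boto3.client("rekognition").detect_labels(...)`; the threshold of 45.0 echoes the lowest score shown in the list above.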

Imagga
created on 2021-12-14

counter 43.5
man 39
people 28.4
male 28.4
office 27.5
business 26.1
adult 25.9
person 25.9
professional 25.5
work 24.3
sitting 23.2
computer 21.6
laptop 20.9
happy 20.7
working 20.3
smile 19.9
modern 19.6
men 18.9
smiling 18.8
businessman 18.5
looking 18.4
building 17.9
manager 17.7
worker 17.1
executive 16.6
couple 16.5
women 15.8
indoors 15.8
window 15.8
meeting 15.1
interior 15
technology 14.8
corporate 14.6
success 14.5
portrait 14.2
job 14.1
groom 14
together 14
hospital 14
home 13.5
television 13.5
room 13.4
case 13.1
lifestyle 13
occupation 12.8
senior 12.2
mature 12.1
suit 11.7
passenger 11.6
table 11.4
company 11.2
two 11
happiness 11
handsome 10.7
teamwork 10.2
communication 10.1
successful 10.1
businesswoman 10
face 9.9
hand 9.9
team 9.8
family 9.8
human 9.7
leader 9.6
talking 9.5
car 9.3
equipment 9.2
holding 9.1
supermarket 9
cheerful 8.9
new 8.9
group 8.9
discussion 8.8
desk 8.7
mid adult 8.7
businesspeople 8.5
clinic 8.4
black 8.4
attractive 8.4
house 8.4
color 8.3
fun 8.2
mercantile establishment 8.2
indoor 8.2
science 8
salon 7.9
medicine 7.9
love 7.9
space 7.8
shop 7.7
elderly 7.7
boss 7.6
serious 7.6
doctor 7.5
grocery store 7.3
student 7.2
monitor 7.2
employee 7.1
medical 7.1
architecture 7

Google
created on 2021-12-14

Black 89.6
Coat 89.5
Black-and-white 83.5
Suit 75.6
Monochrome photography 72.5
Monochrome 72.1
Tableware 72.1
Room 70.8
Hat 67.9
Table 67.5
History 62.5
Vintage clothing 61.5
Shelf 58.9
Art 58.1
Aircraft 57.6
Aviation 56.6
Photographic paper 54.8
Desk 52.8

Microsoft
created on 2021-12-14

text 99.7
window 97.2
person 93.9
indoor 91.4
man 90.2
clothing 67.4
old 63.7
black and white 54

Face analysis

AWS Rekognition

Age 48-66
Gender Male, 97.3%
Calm 93.7%
Sad 4.1%
Happy 0.8%
Angry 0.5%
Confused 0.5%
Surprised 0.2%
Fear 0.1%
Disgusted 0%

AWS Rekognition

Age 28-44
Gender Female, 64.3%
Calm 52.7%
Sad 43.6%
Angry 1.1%
Confused 1%
Fear 0.6%
Happy 0.5%
Surprised 0.4%
Disgusted 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
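
The AWS Rekognition entries above report an age range, a gender estimate, and a ranked emotion list per detected face; that is the structure DetectFaces returns, with each emotion as a dict carrying `Type` and `Confidence`. The sample `face` dict and the `dominant_emotion` helper below are a hypothetical sketch of how such a response might be summarized, not part of the record.

```python
# Hypothetical sample mirroring Rekognition DetectFaces output for the first face above.
face = {
    "AgeRange": {"Low": 48, "High": 66},
    "Gender": {"Value": "Male", "Confidence": 97.3},
    "Emotions": [
        {"Type": "CALM", "Confidence": 93.7},
        {"Type": "SAD", "Confidence": 4.1},
        {"Type": "HAPPY", "Confidence": 0.8},
    ],
}

def dominant_emotion(face):
    """Return the highest-confidence emotion as 'Label score%'."""
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    return f'{top["Type"].capitalize()} {top["Confidence"]}%'

print(dominant_emotion(face))
```

For this face the helper would report the calm reading that dominates the list above.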

Feature analysis

Amazon

Person 99.8%

Captions

Microsoft

a group of people sitting in front of a window 73.8%
a group of people standing in front of a window 73.7%
a group of people in front of a window 73.6%