Human Generated Data

Title

Untitled (two women wearing hats standing and drinking tea)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14923

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Clothing 99.8
Apparel 99.8
Person 99.2
Human 99.2
Person 98.5
Building 86
Metropolis 86
City 86
Urban 86
Town 86
Person 85.7
Sleeve 84
Chair 73
Furniture 73
Long Sleeve 72.4
Coat 67.2
Hat 60.3
Meal 59.2
Food 59.2
Suit 57.6
Overcoat 57.6
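Label lists like the Amazon block above come back from the tagging service as (name, confidence) pairs. A minimal sketch of filtering such a list by confidence, using a subset of the values copied from the record (the threshold is an arbitrary assumption, not part of the source):

```python
# (label, confidence) pairs, copied from the Amazon tag list above.
labels = [
    ("Clothing", 99.8), ("Apparel", 99.8), ("Person", 99.2),
    ("Building", 86.0), ("Hat", 60.3), ("Suit", 57.6),
]

def filter_labels(pairs, min_confidence=80.0):
    """Keep only labels at or above the confidence threshold."""
    return [name for name, conf in pairs if conf >= min_confidence]

print(filter_labels(labels))  # -> ['Clothing', 'Apparel', 'Person', 'Building']
```

Lowering `min_confidence` admits weaker guesses such as "Hat" (60.3) and "Suit" (57.6), which is why low-scoring tags in lists like this should be read with caution.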

Imagga
created on 2022-01-29

hairdresser 64.7
salon 33.4
man 32.2
people 31.2
office 26.9
barbershop 25.5
shop 24.4
adult 23.7
person 23.1
male 22
business 21.3
professional 20.2
portrait 19.4
indoors 19.3
work 18
businessman 16.8
worker 15.7
happy 15.7
mercantile establishment 15.6
working 15
medical 15
looking 14.4
room 14.3
smiling 13.7
job 13.3
medicine 13.2
doctor 13.2
smile 12.8
home 12.8
executive 12.4
hospital 12.4
lifestyle 12.3
sitting 12
modern 11.9
pretty 11.9
indoor 11.9
women 11.9
two 11.9
corporate 11.2
businesswoman 10.9
clinic 10.9
team 10.7
holding 10.7
face 10.6
interior 10.6
lady 10.5
attractive 10.5
place of business 10.4
meeting 10.4
domestic 10.3
men 10.3
black 10.2
casual 10.2
cheerful 9.7
health 9.7
computer 9.6
nurse 9.6
businesspeople 9.5
career 9.5
desk 9.4
patient 9.4
clothing 9.2
laptop 9.1
fashion 9
human 9
success 8.8
chair 8.8
couple 8.7
standing 8.7
mid adult 8.7
employee 8.7
manager 8.4
mature 8.4
color 8.3
occupation 8.2
dress 8.1
sexy 8
life 8
laboratory 7.7
bow tie 7.6
house 7.5
senior 7.5
inside 7.4
20s 7.3
successful 7.3
group 7.3
kitchen 7.2
handsome 7.1
family 7.1
happiness 7.1

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

person 98.8
wall 98.7
text 95.2
man 94.9
indoor 91.2
clothing 88.8
black and white 85
waste container 61.9
vase 58.6
human face 54.8
monochrome 52.1

Face analysis

AWS Rekognition

Age 24-34
Gender Male, 95.5%
Surprised 95.9%
Angry 1.3%
Calm 1.3%
Happy 0.8%
Fear 0.2%
Disgusted 0.2%
Sad 0.2%
Confused 0.1%
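Emotion scores like the AWS Rekognition block above form a distribution over candidate emotions; the reported reading is simply the highest-scoring entry. A sketch of that selection, with the scores copied from the record:

```python
# Emotion confidence scores from the AWS Rekognition face analysis above.
emotions = {
    "Surprised": 95.9, "Angry": 1.3, "Calm": 1.3, "Happy": 0.8,
    "Fear": 0.2, "Disgusted": 0.2, "Sad": 0.2, "Confused": 0.1,
}

def dominant_emotion(scores):
    """Return the emotion label with the highest confidence score."""
    return max(scores, key=scores.get)

print(dominant_emotion(emotions))  # -> Surprised
```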

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%

Captions

Microsoft

a person standing in front of a mirror posing for the camera 85.5%
a man and a woman standing in front of a mirror 83.8%
a man and woman standing in front of a mirror posing for the camera 72.7%
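Captioning services typically return several candidate captions ranked by confidence, as in the Microsoft block above. A sketch of selecting the top candidate (captions and scores copied from the record):

```python
# (caption, confidence) candidates from the Microsoft block above.
captions = [
    ("a person standing in front of a mirror posing for the camera", 85.5),
    ("a man and a woman standing in front of a mirror", 83.8),
    ("a man and woman standing in front of a mirror posing for the camera", 72.7),
]

def best_caption(candidates):
    """Pick the candidate caption with the highest confidence."""
    return max(candidates, key=lambda pair: pair[1])[0]

print(best_caption(captions))
```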