Human Generated Data

Title

Untitled (elderly women wearing hats taking desserts from buffet table)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14921

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 99%
Human 99%
Person 98.9%
Person 96.2%
Dish 93.6%
Food 93.6%
Meal 93.6%
Clothing 88.8%
Apparel 88.8%
Dessert 70.9%
Cake 70.9%
Icing 70.9%
Creme 70.9%
Cream 70.9%
People 63.6%
Table 62.3%
Furniture 62.3%
Advertisement 62.1%
Photography 60.9%
Photo 60.9%
Linen 59.1%
Home Decor 59.1%
Poster 57.4%
Hat 55.7%

Imagga
created on 2022-01-29

man 34.3
people 32.3
person 32
salon 31.3
male 24.8
home 23.9
adult 23.2
professional 21.5
indoors 20.2
kitchen 17.9
patient 17.7
happy 17.5
smiling 17.4
senior 16.9
medical 16.8
men 15.5
looking 15.2
lifestyle 15.2
domestic 14.8
doctor 14.1
room 14
worker 13.9
cheerful 13.8
sitting 13.7
smile 13.5
work 13.5
women 13.4
nurse 13.4
indoor 12.8
health 12.5
interior 12.4
hospital 12.3
couple 12.2
shop 12.2
office 12.1
portrait 11.6
house 10.9
clothing 10.8
barbershop 10.6
medicine 10.6
mature 10.2
clinic 10
dress 9.9
team 9.9
lady 9.7
job 9.7
working 9.7
business 9.7
together 9.6
table 9.6
two 9.3
face 9.2
drink 9.2
alone 9.1
coat 9
fashion 9
black 9
mercantile establishment 8.9
family 8.9
standing 8.7
instrument 8.6
husband 8.6
loving 8.6
talking 8.6
modern 8.4
old 8.4
holding 8.3
occupation 8.2
human 8.2
love 7.9
cooking 7.9
food 7.9
happiness 7.8
uniform 7.7
surgeon 7.7
life 7.7
drinking 7.7
illness 7.6
equipment 7.6
phone 7.4
inside 7.4
chair 7.3

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

wall 96.5%
clothing 96%
person 94.4%
black and white 89.7%
indoor 88.5%
text 86.7%
human face 55.1%

Face analysis

AWS Rekognition

Age 45-53
Gender Female, 90.4%
Calm 90.6%
Surprised 3.8%
Sad 3.6%
Fear 0.6%
Angry 0.4%
Disgusted 0.4%
Happy 0.4%
Confused 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%
Hat 55.7%

Captions

Microsoft

a group of people performing on a counter 73.8%
a group of people wearing costumes 60.4%
a group of people performing on a counter top 60.3%

Text analysis

Amazon

TOPS
VOTED TOPS
VOTED
I
AC
CHESTERRIELS
the
The