Human Generated Data

Title

Untitled (people at buffet table)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17202

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.6
Human 99.6
Person 99.4
Person 98.3
Person 97.3
Person 96.5
Food 96.1
Meal 96.1
Person 95.9
Person 95.7
Apparel 94.8
Clothing 94.8
Dish 88.6
Indoors 81.5
Room 77.4
People 76.2
Female 66.6
Restaurant 66.3
Cafeteria 66.3
Plant 61.9
Table 61.8
Furniture 61.8
Girl 60.9
Linen 59.7
Home Decor 59.7
Buffet 58.9
Suit 58.6
Overcoat 58.6
Coat 58.6
Flower 58.4
Blossom 58.4
Dressing Room 55.8
Crowd 55.4

Imagga
created on 2022-02-26

plastic bag 39.5
man 34.9
bag 31.1
people 29
person 26.3
male 24.1
container 24.1
couple 23.5
happy 22.5
bride 18.2
love 18.1
wedding 16.5
happiness 16.4
senior 15.9
adult 15.7
groom 15.6
seller 15.6
marriage 15.2
smiling 14.5
married 14.4
old 13.9
home 13.5
wife 13.3
together 13.1
smile 12.8
celebration 12.8
husband 12.4
men 12
two 11.8
health 11.8
elderly 11.5
medical 11.5
human 11.2
portrait 11
work 11
joy 10.8
suit 10.8
holiday 10.7
hand 10.6
doctor 10.3
sitting 10.3
women 10.3
mask 10
worker 9.9
dress 9.9
cheerful 9.7
medicine 9.7
flowers 9.6
day 9.4
tradition 9.2
professional 9.1
bouquet 9.1
room 9.1
holding 9.1
outdoors 8.9
spectator 8.9
costume 8.9
to 8.8
beard 8.8
father 8.8
bridal 8.7
ceremony 8.7
lifestyle 8.7
life 8.6
retirement 8.6
enjoying 8.5
nurse 8.3
care 8.2
active 8.2
kin 8.2
indoors 7.9
veil 7.8
surgery 7.8
surgeon 7.6
rose 7.5
fun 7.5
leisure 7.5
hospital 7.4
equipment 7.4
cute 7.2
stall 7.1
family 7.1
working 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

person 99.4
clothing 92.3
people 75.1
black and white 74.3
text 72.3
man 69.8
group 66.1
old 43.1
crowd 0.5

Face analysis

Amazon

Google

AWS Rekognition

Age 34-42
Gender Female, 85%
Calm 96.5%
Sad 2%
Happy 0.6%
Angry 0.3%
Surprised 0.2%
Fear 0.2%
Confused 0.1%
Disgusted 0.1%

AWS Rekognition

Age 26-36
Gender Male, 61.5%
Calm 63.3%
Fear 18.8%
Happy 5.3%
Sad 4.9%
Surprised 4.2%
Disgusted 1.6%
Angry 1.1%
Confused 0.8%

AWS Rekognition

Age 14-22
Gender Female, 98.1%
Calm 83.3%
Surprised 7.1%
Happy 6.3%
Angry 1.4%
Sad 0.9%
Disgusted 0.6%
Confused 0.2%
Fear 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%

Captions

Microsoft

a group of people standing in a room 97.1%
a group of people standing around each other 92.6%
a group of people standing in an old photo of a person 88.2%