Human Generated Data

Title

Untitled (large group of women at tea)

Date

1949

People

Artist: Robert Burian, American, active 1940s–1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19357

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 98.5
Human 98.5
Person 97.7
Person 97.1
Person 95.2
Room 92.9
Indoors 92.9
Apparel 92.5
Clothing 92.5
Person 91.2
Dressing Room 87.5
Person 74.7
Person 73.9
Person 73.6
People 72.6
Person 69.6
Crowd 66.1
Overcoat 65.6
Suit 65.6
Coat 65.6
Furniture 63.1
Photography 62.9
Photo 62.9
Person 62.9
Female 61.5
Person 59.8
Person 59.1
Table 58.7
Workshop 57
Clinic 55.8
Fashion 55.5
Gown 55.5
Evening Dress 55.5
Robe 55.5
Person 49.8

Imagga
created on 2022-03-05

salon 47.3
interior 30.1
table 29.7
people 29
shop 27.7
room 26
indoors 22.8
man 20.2
women 19.8
home 19.1
adult 18.1
indoor 17.3
person 17.3
sitting 17.2
shoe shop 16.6
male 16.3
mercantile establishment 16
business 15.8
modern 15.4
glass 15.2
case 14.9
happy 13.8
lifestyle 13.7
window 13.2
inside 12.9
restaurant 12.8
two 12.7
house 12.5
chair 12.5
holding 12.4
men 12
furniture 12
hairdresser 11.6
life 11.4
light 11.4
couple 11.3
group 11.3
design 11.3
boutique 11.1
kitchen 11
work 11
place of business 11
elegance 10.9
smiling 10.8
dress 10.8
family 10.7
decor 10.6
luxury 10.3
fashion 9.8
working 9.7
together 9.6
celebration 9.6
party 9.5
happiness 9.4
architecture 9.4
dinner 9.3
wedding 9.2
barbershop 9
style 8.9
office 8.8
professional 8.8
clothing 8.7
decoration 8.7
elegant 8.6
dining 8.6
smile 8.5
casual 8.5
glasses 8.3
suit 8.1
domestic 8
love 7.9
urban 7.9
food 7.9
bride 7.7
sofa 7.7
comfortable 7.6
living 7.6
drink 7.5
doctor 7.5
enjoyment 7.5
wine 7.5
mature 7.4
counter 7.4
shopping 7.3
occupation 7.3
cheerful 7.3
looking 7.2
hospital 7.2
romance 7.1
businessman 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

person 96
black and white 92
clothing 85.3
text 85
table 74.3
clothes 25.3

Face analysis

Amazon

Google

AWS Rekognition

Age 29-39
Gender Female, 70.6%
Calm 100%
Happy 0%
Sad 0%
Confused 0%
Disgusted 0%
Surprised 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 34-42
Gender Male, 92%
Calm 98.5%
Sad 1.1%
Confused 0.1%
Happy 0.1%
Surprised 0.1%
Disgusted 0.1%
Fear 0%
Angry 0%

AWS Rekognition

Age 25-35
Gender Female, 61%
Calm 71.8%
Happy 15.6%
Sad 10.9%
Surprised 0.7%
Confused 0.5%
Disgusted 0.2%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 30-40
Gender Male, 99%
Calm 99%
Surprised 0.4%
Disgusted 0.2%
Angry 0.1%
Fear 0.1%
Happy 0.1%
Confused 0.1%
Sad 0.1%

AWS Rekognition

Age 45-53
Gender Male, 99.6%
Calm 31.5%
Confused 19.1%
Sad 14.5%
Angry 11.2%
Disgusted 8.3%
Surprised 6.9%
Happy 6%
Fear 2.5%

AWS Rekognition

Age 21-29
Gender Male, 84.7%
Fear 46.6%
Happy 29.9%
Sad 8.9%
Calm 7%
Angry 3.3%
Surprised 1.7%
Disgusted 1.7%
Confused 0.9%

AWS Rekognition

Age 24-34
Gender Female, 58.8%
Confused 55.5%
Disgusted 12.5%
Fear 10.7%
Surprised 7.4%
Calm 6.6%
Sad 5.5%
Angry 0.9%
Happy 0.9%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Feature analysis

Amazon

Person 98.5%

Captions

Microsoft

a group of people in a room 94.6%
a group of people standing next to a window 84.9%
a group of people standing in a room 84.8%

Text analysis

Amazon

23
KODAKA