Human Generated Data

Title

Untitled (people sitting at tables)

Date

1959

People

Artist: Bachrach Studios, founded 1868

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18972

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Clothing 99.9
Apparel 99.9
Human 99.2
Person 99.2
Person 98.5
Person 98.4
Person 96.7
Dress 95.9
Female 93
Person 92.2
Accessories 92
Tie 92
Accessory 92
Furniture 91.4
Chair 91.4
Person 90.3
Person 87.7
Woman 80.9
Costume 78.8
Person 78
Face 71.4
Skirt 71.2
Girl 70.8
People 68.9
Hat 62.7
Portrait 61.8
Photo 61.8
Photography 61.8
Footwear 58.3
Shoe 58.3
Performer 57
Shoe 55.4
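The flat "label confidence" lines above can be turned into structured data for downstream use. Below is a minimal sketch (hypothetical helpers, not part of the museum's or Amazon's pipeline) that parses each line into a (label, score) pair and filters by a confidence threshold:

```python
def parse_labels(raw: str) -> list[tuple[str, float]]:
    """Split each 'Label 99.9' line into a (label, score) pair."""
    pairs = []
    for line in raw.strip().splitlines():
        # The score is the last space-separated token on the line.
        name, _, score = line.rpartition(" ")
        pairs.append((name, float(score)))
    return pairs


def above(pairs: list[tuple[str, float]], threshold: float) -> list[tuple[str, float]]:
    """Keep only labels at or above the given confidence."""
    return [(name, s) for name, s in pairs if s >= threshold]


# Sample lines taken from the tag list above.
sample = """Clothing 99.9
Person 99.2
Woman 80.9
Hat 62.7"""

labels = parse_labels(sample)
print(above(labels, 90))  # [('Clothing', 99.9), ('Person', 99.2)]
```

Thresholding this way mirrors how low-confidence tags (e.g. "Shoe 55.4") would typically be dropped before indexing.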

Imagga
created on 2022-03-05

kin 60.1
people 26.2
man 23.5
male 20.5
person 20
cap 19.7
happy 17.5
adult 16.2
shower cap 16.2
clothing 14.6
women 14.2
portrait 14.2
holiday 13.6
nurse 13.4
happiness 13.3
men 12.9
hat 12.7
headdress 12.4
medical 12.3
senior 12.2
couple 11.3
human 11.2
sitting 11.2
child 11.1
family 10.7
salon 10.5
fun 10.5
health 10.4
doctor 10.3
love 10.2
team 9.8
worker 9.8
bride 9.6
celebration 9.6
work 9.5
costume 9.4
negative 9.1
old 9
professional 9
suit 9
group 8.9
smiling 8.7
winter 8.5
studio 8.3
fashion 8.3
wedding 8.3
sibling 8.2
mask 8
medicine 7.9
seasonal 7.9
black 7.8
gown 7.8
attractive 7.7
clinic 7.7
outdoor 7.6
uniform 7.6
groom 7.5
friendship 7.5
outdoors 7.5
covering 7.4
instrument 7.4
teamwork 7.4
equipment 7.3
present 7.3
dress 7.2
film 7.2
religion 7.2
patient 7.1
to 7.1
indoors 7
together 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

person 97.9
clothing 92.9
sport 86.7
dress 86.5
woman 81.2
standing 80.3
text 79.2
footwear 77.1
dancer 75.7
posing 41.7
female 32.4

Face analysis

Amazon

Google

AWS Rekognition

Age 19-27
Gender Female, 79.4%
Surprised 87.8%
Fear 6%
Calm 4%
Disgusted 0.6%
Angry 0.6%
Sad 0.5%
Confused 0.4%
Happy 0.1%

AWS Rekognition

Age 34-42
Gender Male, 95.2%
Surprised 99.4%
Happy 0.4%
Angry 0.1%
Calm 0.1%
Fear 0%
Disgusted 0%
Confused 0%
Sad 0%

AWS Rekognition

Age 51-59
Gender Male, 99.9%
Surprised 47.8%
Calm 25.4%
Confused 12.6%
Sad 7.9%
Happy 2.8%
Fear 1.6%
Disgusted 1.1%
Angry 0.8%

AWS Rekognition

Age 33-41
Gender Male, 99.6%
Calm 85.6%
Surprised 12.5%
Disgusted 0.6%
Sad 0.5%
Confused 0.3%
Angry 0.2%
Happy 0.2%
Fear 0.1%

AWS Rekognition

Age 48-56
Gender Male, 86.4%
Calm 66.2%
Surprised 15.3%
Angry 6.7%
Confused 6.1%
Disgusted 2.3%
Sad 1.9%
Fear 0.8%
Happy 0.8%
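Each AWS Rekognition face block above reports a confidence for every emotion; the face's dominant emotion is simply the highest-scoring entry. A minimal sketch (a hypothetical helper, not Rekognition's own API) of that selection:

```python
def dominant_emotion(scores: dict[str, float]) -> str:
    """Return the emotion with the highest confidence score."""
    return max(scores, key=scores.get)


# Readings taken from the first face block above.
face = {
    "Surprised": 87.8,
    "Fear": 6.0,
    "Calm": 4.0,
    "Disgusted": 0.6,
    "Angry": 0.6,
    "Sad": 0.5,
    "Confused": 0.4,
    "Happy": 0.1,
}
print(dominant_emotion(face))  # Surprised
```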

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%
Tie 92%
Shoe 58.3%

Captions

Microsoft

a group of people posing for a photo 90.3%
a group of people posing for the camera 90.2%
a group of people posing for a picture 90.1%

Text analysis

Amazon

Mrs
24
G
Jr
/59
5/
Carney
Mrs G K Carney Jr 5/ /59 (متط)
KO
K
(متط)