Human Generated Data

Title

Untitled (three women chatting at large food table in fancy dining room with two others in doorway)

Date

1959

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9650

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.3
Human 99.3
Food 98.9
Meal 98.9
Person 98.6
Person 98.1
Person 97.2
Person 95.7
People 88
Indoors 84.1
Room 83.1
Clothing 80
Apparel 80
Table 78
Furniture 78
Restaurant 73
Cafeteria 73
Portrait 67.1
Face 67.1
Photo 67.1
Photography 67.1
Female 65
Art 63.1
Painting 63.1
Dish 61
Dining Table 59.3
Buffet 59.3
Dining Room 58.7
Tabletop 57.1
Dress 56.7
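The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's DetectLabels operation. As a hypothetical sketch (the response shape mirrors DetectLabels' `{"Labels": [{"Name": ..., "Confidence": ...}]}` output, with sample values copied from the tags listed above), such a list might be filtered to the high-confidence tags like so:

```python
# Sketch only: a DetectLabels-style response, populated with a few
# of the Amazon tag/confidence pairs recorded above.
response = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.3},
        {"Name": "Food", "Confidence": 98.9},
        {"Name": "Indoors", "Confidence": 84.1},
        {"Name": "Restaurant", "Confidence": 73.0},
        {"Name": "Dress", "Confidence": 56.7},
    ]
}

def confident_labels(response, threshold=80.0):
    """Keep only label names at or above the confidence threshold."""
    return [lab["Name"] for lab in response["Labels"]
            if lab["Confidence"] >= threshold]

print(confident_labels(response))  # ['Person', 'Food', 'Indoors']
```

The threshold of 80.0 is an arbitrary illustrative cutoff; the record above lists tags down to roughly 56% confidence.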

Imagga
created on 2022-01-23

man 25.1
people 24
male 22.7
couple 20
person 18.2
business 16.4
businessman 15.9
men 15.5
women 14.2
adult 11.8
team 11.6
group 11.3
happy 11.3
old 11.1
groom 10.9
sitting 10.3
blackboard 10.3
work 10.3
love 10.3
black 10.2
silhouette 9.9
light 9.4
two 9.3
professional 9.3
cheerful 8.9
husband 8.8
smiling 8.7
finance 8.4
office 8.2
bride 8.1
symbol 8.1
religion 8.1
success 8
family 8
table 8
job 8
kin 7.8
day 7.8
portrait 7.8
grunge 7.7
money 7.7
marriage 7.6
power 7.6
meeting 7.5
senior 7.5
human 7.5
film 7.4
occupation 7.3
design 7.3
businesswoman 7.3
suit 7.2
home 7.2
negative 7.1
happiness 7.1
child 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 98.6
clothing 91.1
person 89
woman 76.8
old 62.3
drawing 54.2

Face analysis

AWS Rekognition

Age 18-26
Gender Female, 99.9%
Surprised 96.5%
Happy 1%
Calm 1%
Fear 0.6%
Sad 0.5%
Confused 0.2%
Disgusted 0.2%
Angry 0.1%

AWS Rekognition

Age 50-58
Gender Female, 79.6%
Sad 56.8%
Happy 26.5%
Confused 8.8%
Calm 2.2%
Fear 1.9%
Angry 1.4%
Surprised 1.3%
Disgusted 1.2%

AWS Rekognition

Age 24-34
Gender Male, 54.6%
Happy 57.3%
Sad 18.4%
Disgusted 7.5%
Calm 4.6%
Confused 4.6%
Surprised 3.2%
Angry 2.2%
Fear 2.2%

AWS Rekognition

Age 36-44
Gender Male, 97.8%
Calm 98.8%
Sad 0.3%
Happy 0.3%
Confused 0.2%
Disgusted 0.2%
Angry 0.1%
Fear 0.1%
Surprised 0%
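Each AWS Rekognition block above pairs a detected face's emotions with confidences, as in the `Emotions` list of a DetectFaces-style response. As a hypothetical sketch (field names follow DetectFaces' output shape; the sample values are copied from the last face above), the dominant emotion can be read off like so:

```python
# Sketch only: a DetectFaces-style face record, populated from the
# fourth AWS Rekognition face listed above (Age 36-44, Calm 98.8%).
face = {
    "AgeRange": {"Low": 36, "High": 44},
    "Gender": {"Value": "Male", "Confidence": 97.8},
    "Emotions": [
        {"Type": "CALM", "Confidence": 98.8},
        {"Type": "SAD", "Confidence": 0.3},
        {"Type": "HAPPY", "Confidence": 0.3},
    ],
}

def dominant_emotion(face):
    """Return the (type, confidence) of the highest-confidence emotion."""
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

print(dominant_emotion(face))  # ('CALM', 98.8)
```

Note that the per-face emotion confidences above sum to roughly 100%, so the dominant label for the first face (Surprised 96.5%) is far more clear-cut than for the second (Sad 56.8% vs. Happy 26.5%).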

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Painting 63.1%

Captions

Microsoft

a group of people standing in front of a window 71.8%
an old photo of a group of people in a room 71.7%
a group of people posing for a photo 71.6%

Text analysis

Amazon

23213
KODAK---IW

Google

पाएछा
2
3
213
पाएछा 2 3 213