Human Generated Data

Title

Untitled (three women sitting on floor chatting while holding large book)

Date

1940-1960

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10026

Machine Generated Data

Tags

Amazon
created on 2022-01-28

Apparel 99.7
Clothing 99.7
Human 99.1
Person 99.1
Person 98.6
Person 94
Female 94
Woman 85.3
Dress 73.7
Shorts 71
Skirt 67.3
Couch 66.3
Furniture 66.3
Gown 65.6
Evening Dress 65.6
Fashion 65.6
Robe 65.6
Girl 62.5
Teen 57.8
Kid 57.8
Child 57.8
Blonde 57.8
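
The labels above match the output shape of AWS Rekognition's DetectLabels API. A minimal sketch of such a call, assuming Python with boto3; the file name and the MinConfidence threshold are illustrative placeholders, not values recorded with this object.

    import boto3

    rekognition = boto3.client("rekognition")  # region/credentials come from the AWS config

    # Hypothetical local copy of the photograph.
    with open("untitled_three_women.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=50,  # only return labels scored at 50 or above
    )

    # Each label carries a name and a 0-100 confidence score,
    # flattening to pairs like "Apparel 99.7" above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')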

Imagga
created on 2022-01-28

person 40
man 38.3
people 34.6
male 30.5
adult 27.9
senior 27.2
patient 23.8
nurse 23.2
couple 20
smiling 19.5
women 19
sitting 18.9
lifestyle 18.1
mature 17.7
indoors 17.6
happy 17.5
together 17.5
love 17.3
home 15.9
teacher 15.8
portrait 15.5
businessman 15
happiness 14.9
men 14.6
business 14.6
retirement 14.4
clothing 14.3
face 14.2
sick person 14
table 13.8
room 13.8
case 13.7
husband 13.5
kin 12.9
elderly 12.4
groom 12
casual 11.9
retired 11.6
cheerful 11.4
meeting 11.3
office 11.2
professional 11.2
mother 10.9
team 10.7
holding 10.7
medical 10.6
old 10.4
wife 10.4
looking 10.4
planner 10
dress 9.9
family 9.8
lady 9.7
health 9.7
two people 9.7
30s 9.6
bride 9.6
married 9.6
executive 9.5
color 9.4
laptop 9.3
two 9.3
computer 8.9
interior 8.8
working 8.8
40s 8.8
older 8.7
education 8.7
talking 8.5
expression 8.5
togetherness 8.5
work 8.3
wedding 8.3
20s 8.2
book 8.2
indoor 8.2
educator 8.2
group 8.1
handsome 8
worker 8
to 8
hospital 7.9
look 7.9
smile 7.8
colleagues 7.8
mid adult 7.7
reading 7.6
fashion 7.5
doctor 7.5
camera 7.4
grandfather 7.3
world 7.3
job 7.1
day 7.1
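
The weighted tags above follow the shape of Imagga's v2 tagging endpoint. A minimal sketch, assuming Python with requests; the key, secret, image URL, and the exact response schema are assumptions based on Imagga's public v2 API, not anything recorded here.

    import requests

    API_KEY = "your_api_key"        # placeholder
    API_SECRET = "your_api_secret"  # placeholder
    IMAGE_URL = "https://example.org/photo.jpg"  # hypothetical image location

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),  # HTTP Basic auth with key/secret
    )
    response.raise_for_status()

    # Tags arrive as {"tag": {"en": ...}, "confidence": ...} objects,
    # flattening to pairs like "person 40" and "man 38.3" above.
    for entry in response.json()["result"]["tags"]:
        print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')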

Google
created on 2022-01-28

Microsoft
created on 2022-01-28

person 98.4
clothing 94.9
text 93.4
dress 84.7
woman 76.5
human face 74
sport 67
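
The Microsoft tags above match Azure Computer Vision's analyze operation with the Tags visual feature. A minimal sketch over the v3.2 REST API; the endpoint, key, and image URL are placeholders.

    import requests

    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
    KEY = "your_subscription_key"                                     # placeholder
    IMAGE_URL = "https://example.org/photo.jpg"                       # hypothetical

    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": IMAGE_URL},
    )
    response.raise_for_status()

    # Azure scores confidence on a 0-1 scale; scaling by 100 gives
    # figures like "person 98.4" above.
    for tag in response.json()["tags"]:
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')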

Face analysis

AWS Rekognition

Age 38-46
Gender Female, 99.6%
Happy 59.5%
Surprised 17.4%
Fear 14.3%
Calm 3.7%
Angry 1.7%
Sad 1.5%
Confused 1.1%
Disgusted 0.8%

AWS Rekognition

Age 48-54
Gender Male, 60%
Happy 56%
Calm 17.1%
Surprised 7%
Sad 6.6%
Confused 5.7%
Disgusted 3.6%
Angry 2%
Fear 2%

AWS Rekognition

Age 48-54
Gender Male, 94.8%
Calm 90.1%
Happy 7.8%
Sad 0.7%
Confused 0.4%
Disgusted 0.3%
Angry 0.3%
Fear 0.2%
Surprised 0.2%
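
The three blocks above are per-face results in the shape returned by AWS Rekognition's DetectFaces API with full attributes. A minimal sketch, assuming Python with boto3 and a hypothetical local file:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("untitled_three_women.jpg", "rb") as f:  # hypothetical local file
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age, gender, and emotions, not just bounding boxes
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        # Emotions come back unordered; sort descending to mirror the lists above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')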

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
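
The Google Vision blocks above report likelihood buckets rather than numeric scores, which is how the Cloud Vision face detection API expresses its attributes. A minimal sketch with the google-cloud-vision client; the file path is a placeholder.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()  # credentials via GOOGLE_APPLICATION_CREDENTIALS

    with open("untitled_three_women.jpg", "rb") as f:  # hypothetical local file
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY).
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)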

Feature analysis

Amazon

Person 99.1%

Captions

Microsoft

a group of people standing in a room 90.2%
a group of people around each other 80.1%
a group of people in a room 80%
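
Ranked captions like the three above come from Azure Computer Vision's describe operation. A minimal sketch over the v3.2 REST API, with placeholder endpoint, key, and image URL:

    import requests

    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
    KEY = "your_subscription_key"                                     # placeholder
    IMAGE_URL = "https://example.org/photo.jpg"                       # hypothetical

    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/describe",
        params={"maxCandidates": 3},  # ask for several candidate captions
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": IMAGE_URL},
    )
    response.raise_for_status()

    # Captions arrive ranked with 0-1 confidences, flattening to lines
    # like "a group of people standing in a room 90.2%".
    for caption in response.json()["description"]["captions"]:
        print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')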

Text analysis

Google

MJI7-- YT37A°2 - - NAGON
YT37A°2
NAGON
MJI7--
-
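
The fragments above are whatever characters Google's OCR could find in the print, likely film-edge or darkroom markings rather than meaningful text. A minimal sketch of Cloud Vision text detection; the file path is a placeholder.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("untitled_three_women.jpg", "rb") as f:  # hypothetical local file
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)

    # The first annotation is the full detected block; the rest are
    # the individual fragments ("YT37A°2", "NAGON", ...).
    for annotation in response.text_annotations:
        print(annotation.description)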