Human Generated Data

Title

Untitled (three women looking at book while sitting on floor)

Date

1940-1960

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10025

Machine Generated Data

Tags

Amazon
created on 2022-01-28

Person 99.6
Human 99.6
Person 99.4
Person 99.3
Sitting 98.2
Furniture 83.2
Clothing 80.5
Apparel 80.5
Room 65.3
Indoors 65.3
Couch 64.1
Flooring 60.3
Clinic 60.3
Senior Citizen 59.3
Chair 58

Clarifai
created on 2023-10-27

people 99.9
group 99.2
adult 98.8
wear 98.6
woman 98.5
man 97.8
monochrome 97.5
three 97.1
medical practitioner 96.3
uniform 96.3
child 94.9
two 94.1
group together 94
five 93.8
four 93.4
outfit 89.4
actor 89.2
education 87.5
several 87.3
veil 85

Imagga
created on 2022-01-28

shower cap 48.2
cap 39.1
person 33
people 32.3
couple 31.3
headdress 29.7
man 27.5
adult 27.4
clothing 26.3
happy 26.3
senior 26.2
male 24.8
portrait 23.9
patient 23.7
love 22.1
nurse 20.9
happiness 19.6
bride 19.2
women 18.2
smiling 18.1
elderly 17.2
lifestyle 16.6
wedding 16.5
smile 16.4
case 16.3
together 15.8
dress 15.3
married 15.3
indoors 14.9
mature 14.9
sick person 14.7
sitting 14.6
men 14.6
kin 14.1
home 13.5
casual 13.5
human 13.5
medical 13.2
camera 12.9
groom 12.7
retirement 12.5
husband 12.4
cheerful 12.2
outdoors 11.9
two 11.8
health 11.8
marriage 11.4
face 11.4
bouquet 11.3
attractive 11.2
old 11.1
gown 10.7
two people 10.7
romantic 10.7
older 10.7
wife 10.4
day 10.2
hospital 10.2
world 10.1
consumer goods 10
covering 9.9
family 9.8
professional 9.7
retired 9.7
flowers 9.6
enjoying 9.5
color 9.4
doctor 9.4
care 9
one 9
lady 8.9
looking 8.8
60s 8.8
holiday 8.6
togetherness 8.5
pretty 8.4
summer 8.3
joy 8.3
indoor 8.2
child 8.2
hair 7.9
seniors 7.8
veil 7.8
bonding 7.8
middle aged 7.8
outdoor 7.6
active 7.6
loving 7.6
fashion 7.5
friends 7.5
relaxed 7.5
fun 7.5
holding 7.4
mother 7.4
girls 7.3
grandfather 7.2
romance 7.1
worker 7.1

Google
created on 2022-01-28

Furniture 93.2
Sleeve 85.5
Chair 81.9
Tints and shades 74.9
Vintage clothing 74.8
Snapshot 74.3
Classic 73.2
Event 71.3
Hat 68.5
Tartan 67.4
Sitting 66.8
Monochrome 65.9
Monochrome photography 65.8
Stock photography 65.6
History 64.6
Rectangle 64.4
Room 64.1
Plaid 56.5
Pattern 55.3

Microsoft
created on 2022-01-28

text 97.8
person 97.6
racket 96
clothing 96
player 79
woman 67.3
old 44.2

Face analysis

AWS Rekognition

Age 28-38
Gender Female, 55.1%
Fear 48.5%
Surprised 45.6%
Sad 2.3%
Happy 1.9%
Confused 0.6%
Angry 0.5%
Calm 0.4%
Disgusted 0.4%

AWS Rekognition

Age 40-48
Gender Male, 79.7%
Happy 75.7%
Calm 15.2%
Sad 2.9%
Confused 2.1%
Disgusted 1.7%
Surprised 0.9%
Angry 0.8%
Fear 0.8%

AWS Rekognition

Age 35-43
Gender Female, 99.7%
Sad 70.4%
Calm 24.5%
Happy 4.2%
Angry 0.4%
Confused 0.2%
Disgusted 0.2%
Fear 0.1%
Surprised 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.6%
Person 99.4%
Person 99.3%

Text analysis

Google

MJI7-- YT37A°2-- NAGON
MJI7--
YT37A°2--
NAGON