Human Generated Data

Title

Untitled (three women sitting on floor and chatting while holding book)

Date

1940-1960

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10024

Machine Generated Data

Tags

Amazon
created on 2022-01-28

Chair 100
Furniture 100
Person 99.4
Human 99.4
Person 98.6
Person 97.7
Sitting 88.7
Shorts 82.1
Clothing 82.1
Apparel 82.1
LCD Screen 69.7
Electronics 69.7
Monitor 69.7
Screen 69.7
Display 69.7
Senior Citizen 66.6
Female 64.1
Portrait 61.4
Photography 61.4
Face 61.4
Photo 61.4
Couch 59.3

Clarifai
created on 2023-10-27

people 99.8
group 99.3
adult 97.1
man 96.8
wear 96.7
woman 96.4
child 95.9
monochrome 93.8
music 92.8
portrait 92.1
actor 91.7
dancer 91.5
dancing 91.3
group together 89
five 88.7
theater 88.7
education 88.5
three 86.2
school 85.7
singer 85.2

Imagga
created on 2022-01-28

person 43.7
people 35.1
man 31.6
adult 26.5
male 26.2
senior 21.5
couple 20
portrait 16.8
indoors 16.7
men 16.3
happy 16.3
patient 16
women 15.8
love 15.8
teacher 15
together 14.9
elderly 14.3
mature 13.9
sitting 13.7
face 13.5
nurse 13.2
smiling 13
lifestyle 13
clothing 12.9
home 12.8
professional 12.3
group 11.3
old 11.1
room 11.1
happiness 11
surgeon 10.9
retired 10.7
retirement 10.6
lady 10.5
office 10.4
kin 10.3
casual 10.2
indoor 10
planner 10
smile 10
case 9.9
team 9.8
cheerful 9.7
human 9.7
business 9.7
medical 9.7
older 9.7
looking 9.6
world 9.6
groom 9.5
meeting 9.4
sick person 9.1
health 9
bride 9
color 8.9
spectator 8.9
businessman 8.8
60s 8.8
holiday 8.6
model 8.5
adults 8.5
doctor 8.5
mother 8.3
book 8.2
one 8.2
husband 7.8
bonding 7.8
education 7.8
class 7.7
attractive 7.7
hand 7.6
enjoying 7.6
togetherness 7.5
relationship 7.5
camera 7.4
work 7.2
aged 7.2
black 7.2
suit 7.2
religion 7.2
grandfather 7.2
white 7.1
hospital 7.1
working 7.1

Google
created on 2022-01-28

Microsoft
created on 2022-01-28

person 97.8
text 91.5
clothing 78.7

Face analysis

AWS Rekognition

Age 33-41
Gender Male, 80.6%
Sad 48.3%
Happy 23.5%
Calm 20.1%
Confused 2.9%
Fear 1.7%
Disgusted 1.4%
Angry 1.1%
Surprised 1%

AWS Rekognition

Age 34-42
Gender Female, 55.2%
Fear 51.2%
Surprised 34.2%
Sad 10.2%
Calm 1.5%
Happy 1.1%
Angry 0.9%
Confused 0.5%
Disgusted 0.4%

AWS Rekognition

Age 30-40
Gender Male, 79.7%
Happy 79.4%
Surprised 12.3%
Calm 6%
Sad 0.5%
Angry 0.5%
Disgusted 0.5%
Confused 0.4%
Fear 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.4%
Person 98.6%
Person 97.7%

Text analysis

Google

MJI7-- Y T37A°2 - - NAGON
MJI7--
Y
T37A°2
-
NAGON