Human Generated Data

Title

Untitled (woman and two small children posed looking at baby in living room)

Date

1936

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9116

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.1
Human 99.1
Person 97.2
Person 96.6
Person 96.5
Clothing 94.2
Apparel 94.2
Room 72.9
Indoors 72.9
Clinic 61.3
Furniture 59.7
Bonnet 58.7
Hat 58.7
Art 58

Clarifai
created on 2023-10-27

people 99.7
monochrome 99.1
group 98.6
child 96.9
wedding 95.1
woman 94.6
adult 93.5
wear 93
man 92.7
veil 91.5
nostalgia 90
three 90
princess 90
family 88.1
indoors 87.9
sit 86.8
theater 86.7
portrait 86.5
art 85.4
music 85.3

Imagga
created on 2022-01-23

negative 33
film 31.8
photographic paper 23.9
shower cap 22.9
people 22.8
cap 19.6
person 19.2
bride 16.3
man 16.1
photographic equipment 15.9
clothing 15.7
portrait 15.5
dress 14.4
adult 14.3
love 14.2
television 14.1
headdress 14
face 13.5
hair 13.5
happy 13.1
wedding 12.9
black 12.6
male 12
looking 12
lifestyle 11.6
old 11.1
human 10.5
equipment 10.1
gown 9.8
surgeon 9.4
smile 9.3
sculpture 9.2
hospital 9
religion 9
worker 8.9
medical 8.8
veil 8.8
symbol 8.7
art 8.7
married 8.6
statue 8.6
groom 8.4
health 8.3
passenger 8.3
room 8.3
back 8.3
one 8.2
home 8
holiday 7.9
eyes 7.7
attractive 7.7
mask 7.7
hand 7.6
monitor 7.6
aquarium 7.5
doctor 7.5
senior 7.5
telecommunication system 7.4
care 7.4
camera 7.4
x-ray film 7.2
celebration 7.2
marble 7.1
working 7.1
happiness 7
medicine 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.5
wedding dress 93.6
human face 91
window 90.9
person 90.4
clothing 90.1
bride 81.4
woman 79.4
dress 73.8
old 46.3
picture frame 9.6

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 47-53
Gender Female, 99%
Calm 99.7%
Sad 0.1%
Happy 0.1%
Disgusted 0%
Angry 0%
Surprised 0%
Fear 0%
Confused 0%

AWS Rekognition

Age 14-22
Gender Male, 59.4%
Happy 96.5%
Calm 2.8%
Surprised 0.2%
Sad 0.2%
Disgusted 0.1%
Angry 0.1%
Fear 0.1%
Confused 0%

AWS Rekognition

Age 6-16
Gender Male, 97.5%
Calm 98.8%
Happy 0.5%
Surprised 0.2%
Sad 0.1%
Disgusted 0.1%
Confused 0.1%
Fear 0.1%
Angry 0.1%

Feature analysis

Amazon

Person 99.1%

Categories

Imagga

paintings art 98.8%

Text analysis

Amazon

ECCOE